Search Results

Search found 35343 results on 1414 pages for 'development tools'.


  • Amazon EC2 EBS volume scheduled backup/snapshots using puppet / similar tools

    - by Ehrann Mehdan
    I am not a Linux admin, although I wish I were, and I have seen these questions: Amazon EC2 Backup Strategy; Amazon EC2 + EBS: Regular backup plan?; Simple Backup Strategy for Amazon EC2 instances / volumes?; and this suggestion: http://alestic.com/2009/09/ec2-consistent-snapshot. I tried using the command line + crontab (the command line works, but crontab, for some reason, doesn't), but I'm still pretty lost. All I want is an automated, rolling backup of my Amazon EC2 (EBS) data (by rolling I mean keep 3-4 weeks of history, but delete old snapshots as new ones arrive, for cost control). And as things usually go, if something is hard and painful, someone creates a solution for it. My question is simple: is there a way to do this with a tool like Puppet, without a painful learning curve? (Or via other tools like http://ylastic.com?) If yes, how?
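
    For illustration, here is a minimal sketch of the rolling-snapshot idea in Python with boto3 (a library that postdates this question); the region, volume ID, tag name, and 28-day retention window are all illustrative assumptions, not anything from the post:

      # Minimal sketch: create a snapshot, then prune tagged snapshots
      # older than the retention window. Volume ID and tag are hypothetical.
      import boto3
      from datetime import datetime, timedelta, timezone

      ec2 = boto3.client("ec2", region_name="us-east-1")
      VOLUME_ID = "vol-0123456789abcdef0"  # hypothetical volume

      # Take today's snapshot, tagged so we only ever prune our own.
      ec2.create_snapshot(
          VolumeId=VOLUME_ID,
          Description="rolling-backup",
          TagSpecifications=[{"ResourceType": "snapshot",
                              "Tags": [{"Key": "rolling-backup", "Value": "true"}]}],
      )

      # Delete tagged snapshots older than ~4 weeks.
      cutoff = datetime.now(timezone.utc) - timedelta(days=28)
      snaps = ec2.describe_snapshots(
          OwnerIds=["self"],
          Filters=[{"Name": "tag:rolling-backup", "Values": ["true"]}],
      )["Snapshots"]
      for snap in snaps:
          if snap["StartTime"] < cutoff:
              ec2.delete_snapshot(SnapshotId=snap["SnapshotId"])

    Run from cron (or from a Puppet-managed cron entry), this gives the create-new/delete-old cycle the poster describes.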

    Read the article

  • Open source command line tools for indexing a large number of text files

    - by ergosys
    I'm looking for any open source command line tool or tools which will allow me to index and search a large number of plain text files. Approximate search would be a plus. The tool only needs to print the files that match, although some match context would be useful. A GUI tool isn't useful for my application, nor is anything that searches files one by one (grep, for example). I'm basically targeting Unix platforms (OS X, Linux, BSD). EDIT: I'm not interested in any sort of tool that is system-wide or needs to run in the background. Basically, I want to build an index for a directory tree full of text files and then later be able to search against it. Preferably the index is one or a few files whose location I can specify. Any ideas?
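
    As a sketch of the index-then-search shape such a tool would have, here is a hypothetical stand-in using Python's Whoosh library; the directory names and query string are illustrative:

      # Minimal sketch: build a full-text index over a directory tree,
      # then query it later. Requires Whoosh (pip install whoosh).
      import os
      from whoosh import index
      from whoosh.fields import Schema, ID, TEXT
      from whoosh.qparser import QueryParser

      schema = Schema(path=ID(stored=True, unique=True), body=TEXT)
      os.makedirs("indexdir", exist_ok=True)
      ix = index.create_in("indexdir", schema)  # the index lives in one directory

      writer = ix.writer()
      for root, _, files in os.walk("/path/to/text/files"):
          for name in files:
              fname = os.path.join(root, name)
              with open(fname, errors="ignore") as f:
                  writer.add_document(path=fname, body=f.read())
      writer.commit()

      # Later: search the index and print matching file paths.
      with ix.searcher() as searcher:
          query = QueryParser("body", ix.schema).parse("search terms")
          for hit in searcher.search(query, limit=20):
              print(hit["path"])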

    Read the article

  • Authoring Tools for Classroom Material?

    - by user1413
    I'm interested in putting a whole bunch of classroom material online. This material ranges from accounting classes to yoga. I would like to find a simple-to-use authoring tool that the people who teach the classes can use to put the material online. In other words, I do not want a tool that requires a developer; a person who knows the subject matter and is willing to read the manual should be able to put their material online. At minimum, this tool should allow text and multimedia to be chained together in a logical form, and it should allow quizzes to be created and graded. Even better would be for the tool to have some "smarts", so that subject areas the student does not understand can be drilled. Better still would be built-in e-commerce, so that the instructors can charge for the classes. Are there any such tools?

    Read the article

  • Graphite running under daemontools: carbon-cache.py becoming defunct

    - by pradeepchhetri
    I am running carbon-cache.py and carbon-aggregator.py using daemontools. When I made some changes in storage-schema.conf and tried to restart carbon-cache.py, I found that it becomes a zombie very frequently:
      root 3367 3366 0 03:23 pts/1 00:00:00 supervise carbon-aggregator
      root 3371 3366 0 03:23 pts/1 00:00:00 supervise carbon-cache
      root 3373 3367 3 03:23 pts/1 00:00:02 /usr/bin/python /usr/bin/carbon-aggregator.py --debug start
      root 3379 3372 0 03:23 pts/1 00:00:00 multilog t /var/log/multilog/carbon-cache
      root 3382 3368 0 03:23 pts/1 00:00:00 multilog t /var/log/multilog/carbon-aggregator
      root 3638 3371 21 03:24 pts/1 00:00:00 [carbon-cache.py] <defunct>
    Can someone tell me what the reason may be?

    Read the article

  • Web-based (intranet / non-hosted) timesheet / project tracking tools

    - by warren
    I realize some similar questions have been asked along these lines before, but from reading through them today, it appears they don't match my use case. I am looking for a web-based, non-hosted time and project tracking tool. I've downloaded Collabtive and Achievo so far, but am looking for other suggestions, too. My list of requirements:
      - runs on a standard LAMP stack
      - non-hosted (i.e., there is an option to download and run it on a local server)
      - not a desktop/single-user application
      - easy to use - my audience is a mix of technical and non-technical folks
      - easy to maintain - when time for upgrading comes, I'd really like to not have to rebuild the app (a la ./configure ; make ; make install)
      - needs to support multiple users
      - free-form project additions: we don't have a central project management authority (users should be able to add whatever they're working on, not merely pick from a drop-down)
    Does anyone here have experience with such tools? It doesn't have to be free, but free is always nice :)

    Read the article

  • How to set up Dell OMSA tools on Debian 6 (Squeeze) (PE2950)

    - by javano
    I assume I have set them up incorrectly or am missing a library or dependency. When I log into the OMSA web interface I can't see anything. Also, omreport tells me nothing:
      root@box:~# omreport storage controller
      No controllers found
    I assume these two use the same source of information, so fixing whatever is wrong will fix them both. I set up OMSA as per these instructions. I have also compiled MegaCLI (as this is a PowerEdge 2950 with a PERC 6/i controller) and used it to update the RAID firmware, so that works, but the Dell tools don't. What have I missed during setup?
      root@kvm1:~# cat /etc/issue
      Debian GNU/Linux 6.0 \n \l
      root@kvm1:~# uname -a
      Linux kvm1.mivoice.cust.vostron.net 3.4.9 #1 SMP Wed Aug 22 19:08:46 BST 2012 x86_64 GNU/Linux

    Read the article

  • Tools to monitor guest OS performance in vSphere

    - by Quick Joe Smith
    I am looking for some tool or way to retrieve performance data from guest VMs running under vSphere 4.1. I am currently interested in the four basic metrics: CPU (%), memory (%), disk availability (%) and network utilisation (Kb/s). The issue I have is that all of vSphere's performance data is from an ESXi host perspective (active, shared, consumed, overhead, swapped, etc.), which is far removed from the data from the VM's own perspective. For instance, I have a Windows server VM idling, using around 410MB (~25% of its allocated 2GB) as reported by Task Manager, and this is the value I'm after. vSphere's metrics seem unable to arrive at this figure by any reliable and repeatable means. Is anyone aware of tools that can obtain this kind of data? The simpler, the better.

    Read the article

  • Tools for analyzing performance of SQL Server/Express?

    - by Adam Crossland
    The application that I have customized and continue to support for my client is seeing dramatic performance problems in the field. Simple queries on rather small datasets take over a minute, when I would expect them to complete in sub-second times. My current theory is that SQL Server Express 2005 is too limited for the rather non-trivial demands being made of it, but I am not sure how to go about gathering data that I can use to either prove my point or allow me to move on to finding another cause. Can anyone point me toward some tools that would allow me to analyze the load on this database? Information such as simultaneous connections, execution times of individual queries, memory usage - heck, just any profiling data at all would help. Many thanks.
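
    Short of a full profiler, SQL Server's own dynamic management views (available from SQL Server 2005 onward, including Express) expose execution counts and timings for cached queries. A hedged sketch querying them from Python with pyodbc; the connection string, driver name, and TOP 20 cut-off are assumptions, and the login needs VIEW SERVER STATE permission:

      # Minimal sketch: pull the slowest cached queries from SQL Server's DMVs.
      # Requires pyodbc and a SQL Server ODBC driver; connection string is illustrative.
      import pyodbc

      conn = pyodbc.connect(
          "DRIVER={ODBC Driver 17 for SQL Server};"
          "SERVER=localhost\\SQLEXPRESS;DATABASE=master;Trusted_Connection=yes;"
      )
      cursor = conn.cursor()
      cursor.execute("""
          SELECT TOP 20
              qs.total_elapsed_time / qs.execution_count AS avg_elapsed_us,
              qs.execution_count,
              SUBSTRING(st.text, 1, 200) AS query_text
          FROM sys.dm_exec_query_stats AS qs
          CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st
          ORDER BY avg_elapsed_us DESC
      """)
      for avg_us, count, text in cursor.fetchall():
          # total_elapsed_time is reported in microseconds.
          print(f"{avg_us:>12} us avg  x{count:<6} {text!r}")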

    Read the article

  • Tools to automate recording streaming radio

    - by Stan
    Are there any tools that can automate recording online streaming radio? I've been using Total Recorder, which has the following useful features: a handy scheduler, and support for recording templates, so I can customize high- or low-quality recordings. Unfortunately, it requires opening the streaming radio in a browser and can't have another sound source active at the same time; it records whatever comes out of the speakers. What I am looking for: given an online radio URL, the tool should be able to record the audio stream, regardless of whether I am playing any other music. Does such a tool exist?
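
    The key idea the poster is after - capture the stream from its URL instead of from the sound card - fits in a few lines of Python. A minimal sketch, assuming the station serves a plain HTTP audio stream such as MP3; the URL, output filename, and one-hour duration are illustrative:

      # Minimal sketch: record an HTTP audio stream straight from its URL,
      # independent of whatever is playing on the local sound card.
      import time
      import requests

      STREAM_URL = "http://example.com/radio/stream.mp3"  # hypothetical station
      DURATION_S = 60 * 60  # record one hour

      deadline = time.time() + DURATION_S
      with requests.get(STREAM_URL, stream=True, timeout=30) as resp:
          resp.raise_for_status()
          with open("recording.mp3", "wb") as out:
              for chunk in resp.iter_content(chunk_size=8192):
                  out.write(chunk)
                  if time.time() >= deadline:
                      break

    Scheduling it is then just a cron or Task Scheduler entry.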

    Read the article

  • Lenovo tools for windows 7: can't re-enable wireless

    - by pcampbell
    Consider a netbook: a Lenovo S10e with Windows 7 and the S10 Lenovo power management tools. The machine has the factory BIOS. Fn+F5 is the key combo to toggle the wireless radio on/off. The tool disables the radio fine; that works as expected. The problem is that re-enabling doesn't work, or it's unclear how to re-enable. Previously tried without success: Fn+F5, Fn+Ctrl+F5, Fn+Shift+F5, Fn+Alt+F5. Here's the onscreen display: Question: how can you re-enable the wireless radio using the Function key on a Lenovo netbook?

    Read the article

  • Tools to manage clusters

    - by Stan
    Say there are many game servers; are there any tools that let engineers manage them easily? Below are some requirements:
      - allows RDP (remote desktop) to the servers
      - has group/permission settings, classified by functionality, so that people with permission to access a certain group don't need to enter a password again to RDP to its servers - the tool logs on to the server automatically
      - logs activity: a history of who has logged on to which server
    Thanks.

    Read the article

  • Tools for tracking disk usage

    - by Carey
    I manage a number of Linux fileservers. These all run applications written from 0-10 years ago. As sometimes happens, a machine will come close to, or run out of, disk space. Reasons include applications not rotating log files, a machine with 500GB of disk producing 150GB of new files every month that were not written to tape, databases gradually increasing in size, people doing silly things... generally a bit of chaos. Anyway, when a machine unexpectedly goes from 50% to 100% full in a couple of hours, I figure out what broke (lots of "du") and delete files or contact someone. I can also look at cacti graphs to figure out what the machine's normal disk usage is (e.g. for /home). Does anyone know of any tools that will give finer-grained information on historical usage than a cacti/RRD graph? Like "/home/abc/xyz increased 50GB in the last day".
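
    In the absence of a ready-made tool, that finer-grained history can be approximated with a small cron job: snapshot per-directory usage, then diff against the previous snapshot. A sketch in Python; the root path, snapshot file, and 1 GB reporting threshold are illustrative assumptions:

      # Minimal sketch: record per-directory disk usage (files directly inside
      # each directory, not recursive) to a JSON file, and report directories
      # that grew since the previous run. Intended to run from cron.
      import json
      import os

      ROOT = "/home"
      SNAPSHOT = "/var/tmp/du_snapshot.json"
      THRESHOLD = 1 << 30  # report growth over 1 GB

      def usage_by_dir(root):
          totals = {}
          for dirpath, _, filenames in os.walk(root):
              size = 0
              for name in filenames:
                  try:
                      size += os.lstat(os.path.join(dirpath, name)).st_size
                  except OSError:
                      pass  # file vanished or unreadable; skip it
              totals[dirpath] = size
          return totals

      current = usage_by_dir(ROOT)
      if os.path.exists(SNAPSHOT):
          with open(SNAPSHOT) as f:
              previous = json.load(f)
          for path, size in current.items():
              growth = size - previous.get(path, 0)
              if growth > THRESHOLD:
                  print(f"{path} grew by {growth / 2**30:.1f} GB since last run")
      with open(SNAPSHOT, "w") as f:
          json.dump(current, f)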

    Read the article

  • Tools to manage bunches of servers

    - by Stan
    Platform: most of them are Windows Server 2003, some are CentOS 5. Say there are many game servers; are there any tools that let engineers manage them easily? Below are some requirements:
      - allows RDP (remote desktop) to the servers
      - has group/permission settings, classified by functionality, so that people with permission to access a certain group don't need to enter a password again to RDP to its servers - the tool logs on to the server automatically
      - logs activity: a history of who has logged on to which server
    Thanks.
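
    As a rough illustration of the requirements above (grouping by functionality, automatic logon, and access logging), here is a hypothetical launcher sketch in Python for the Windows boxes; the inventory, log path, credentials, and the cmdkey/mstsc combination are all assumptions, and a real tool would need a proper credential store and per-group permission checks:

      # Minimal sketch: group servers, log who opened an RDP session to what,
      # and launch mstsc. Inventory, log path, and credentials are hypothetical.
      import datetime
      import getpass
      import subprocess
      import sys

      GROUPS = {  # hypothetical inventory, classified by functionality
          "game-frontend": ["gf01.example.com", "gf02.example.com"],
          "game-db": ["db01.example.com"],
      }
      LOGFILE = r"C:\rdp-launcher\access.log"

      group, host = sys.argv[1], sys.argv[2]
      if host not in GROUPS.get(group, []):
          sys.exit(f"{host} is not in group {group}")

      # Record who connected where, and when.
      with open(LOGFILE, "a") as log:
          log.write(f"{datetime.datetime.now():%Y-%m-%d %H:%M:%S} "
                    f"{getpass.getuser()} -> {host} ({group})\n")

      # cmdkey pre-stores credentials so mstsc doesn't prompt again.
      subprocess.run(["cmdkey", f"/generic:TERMSRV/{host}",
                      "/user:DOMAIN\\svc_rdp", "/pass:secret"], check=True)
      subprocess.run(["mstsc", f"/v:{host}"], check=True)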

    Read the article

  • Java Champion Stephen Chin on New Features and Functionality in JavaFX

    - by janice.heiss(at)oracle.com
    In an Oracle Technology Network interview, Java Champion Stephen Chin, Chief Agile Methodologist for GXS, and one of the most prolific and innovative JavaFX developers, provides an update on the rapidly developing changes in JavaFX. Chin expressed enthusiasm about recent JavaFX developments: "There is a lot to be excited about -- JavaFX has a new API face. All the JavaFX 2.0 APIs will be exposed via Java classes that will make it much easier to integrate Java server and client code. This also opens up some huge possibilities for JVM language integration with JavaFX." Chin also spoke about developments in Visage, the new language project created to fill the gap left by JavaFX Script: "It's a domain-specific language for writing user interfaces, which addresses the needs of UI developers. Visage takes over where JavaFX Script left off, providing a statically typed, declarative language with lots of features to make UI development a pleasure." "My favorite language features from Visage are the object literal syntax for quickly building scene graphs and the bind keyword for connecting your UI to the backend model. However, the language is built for UI development from the top down, including subtle details like null-safe dereferencing for exception-less code." Read the entire article.

    Read the article

  • Books about the psychology of decision-making and of people

    - by boos
    I'm a Unix developer and I want to move into project/people management as the next step in my career. I think good communication and people skills can sometimes advance a career more than technical skill; at least in Italy, a lot of people have advanced quickly on the strength of their people skills rather than their technical ones. Has anyone read books on psychology that help with understanding how people and personalities work, and with handling decision-making situations the right way? I have found some interesting books about personality and the psychology of decision-making, but I am in doubt about how useful reading such books really is. Does anyone have experience along this path? Has anyone found it useful to read books like these about how people work, in order to advance their career faster and handle people and decisions better? I have already read Peopleware. The table of contents of one of these books:
      1 - Judgment and decision
      2 - Heuristics and systematic errors
      3 - Estimating probability and frequency prediction
      4 - Risk and decision
      5 - Representation and decision
      6 - Memory, attention and decision
      etc.
    What do you think?

    Read the article

  • Mercurial Conversion from Team Foundation Server

    - by mhawley
    I’m using Twitter. Follow me @matthawley. One of my many (almost) daily tasks when working on the CodePlex platform, since releasing Mercurial as a supported version control system, is converting projects from Team Foundation Server (TFS) to Mercurial. I'm happy to say that of all the conversions I have done since mid-January, the success rate of migrating full source history is about 95%. To get to this success point, I have had to learn and refine several techniques utilizing a few different tools… (read more)

    Read the article

  • PowerShell PowerPack Download

    - by BuckWoody
    I read Jeffery Hicks’ article in this month’s Redmond Magazine on a new add-in for Windows PowerShell 2.0. It’s called the PowerShell Pack, and it has some great new features that I plan to put into place on my production systems as soon as I’ve finished learning and testing them. You can download the pack here if you have PowerShell 2.0. I’m having a lot of fun with it, and I’ll blog about what I’m learning here in the near future, but you should check it out. The only issue I have with it right now is that you have to load a module and then use get-help to find out what it does, because I haven’t found a lot of other documentation so far. The most interesting modules for me are the ones that can run a command elevated (in PSUserTools), the task scheduling commands (in TaskScheduler) and the file system checks and tools (in FileSystem). There’s also a way to create simple Graphical User Interface panels (in ). I plan to string all these together to install a management set of tools on my SQL Server Express instances, giving the user “task buttons” to back up or restore a database, add or delete users, and so on. Yes, I’ll be careful, and yes, I’ll make sure the user is allowed to do that. For now, I’m testing the download, but I thought I would share what I’m up to. If you have PowerShell 2.0 and you download the pack, let me know how you use it. Script disclaimer, for people who need to be told this sort of thing: never trust any script, including those that you find here, until you understand exactly what it does and how it will act on your systems. Always check the script on a test system or virtual machine, not a production system. Yes, there are always multiple ways to do things, and this script may not work in every situation, for everything. It’s just a script, people. All scripts on this site are performed by a professional stunt driver on a closed course. Your mileage may vary. Void where prohibited. Offer good for a limited time only. Keep out of reach of small children. Do not operate heavy machinery while using this script. If you experience blurry vision, indigestion or diarrhea during the operation of this script, see a physician immediately.

    Read the article

  • Essential roles for web application team

    - by jromero
    Some friends of mine came up with an idea for a web application which we (so far) think could be great. I did the analysis and all the early stages of the development process, and I'm about to start coding. We're talking about something that is barely a mid-level project, so I consider one developer (myself) to be enough. The thing is that we are trying to assign roles to each of us so we can focus on our duties and be clear about our responsibilities within the team. We are a crew of four people: three of us (my friends) are business people who would do the marketing, customer relationship, management and accounting work, and I'm basically the developer. I have in mind getting them involved in the development process by giving them documentation to write and using them as testers, besides the management duties they already have. Perhaps someone out there has been in the same situation, so I would appreciate it if you shared the experience; that way we can effectively assign ourselves positions in the project based on what I explained above. What are the essential roles, or the optimal team layout, for developing the idea successfully? The question is not strictly about programming; it's about building a software business beyond the code, which I'm sure plenty of us are looking to do. Any help is really appreciated! Regards.

    Read the article

  • Role of Microsoft certifications ADO.Net, ASP.Net, WPF, WCF and Career?

    - by Steve Johnson
    I am a Microsoft fan and .NET enthusiast, and I want to align my career with current and future .NET technologies. I have an MCTS in ASP.NET 3.5. The question is about continuing certifications, my career growth, and maybe a different job! I want to keep pace with future Microsoft .NET technologies; my current job, however, doesn't allow for this, so I intend to pursue .NET certifications to stay abreast of the latest technologies. My questions:
      1. What certifications should I follow next? I have MCTS .NET 3.5 WPF (Exam 70-502) and MCTS .NET 3.5 WCF (Exam 70-504) in mind, so that I can go into Silverlight development and seek jobs related to it.
      2. What other steps do I need to take to develop professional expertise in technologies such as WPF, WCF and Silverlight when my current employer is reluctant to move to the latest .NET technologies?
    I am sure there are a lot of people around here who work with .NET technologies and have industry experience. Being a newcomer at the start of my career, I need to make the right decision, so I am seeking guidance from this community. Expert replies are much appreciated, and thanks in advance. Best Regards, Steve.

    Read the article

  • Software Developer Interview Question - Fair or Unfair

    - by user607018
    I just had a phone interview with a company for a graduate software developer position and was asked the following questions. I should add that the company concerned is not a database vendor.
      1. How does a query optimiser work?
      2. If a database was performing badly, how would you use the performance logs to find out the problem?
    I have asked whether they ask such questions of all candidate software developers (graduate or experienced) in a first phone interview. They replied that they like to test their candidates' knowledge of database development. I want to write to the company to say that these questions are unreasonable to ask at a software developer interview, and to request that my interview be done over. I would like to check the reasonableness of the following assumptions:
      a) Those questions cannot fairly be classified as database development questions.
      b) The questions are appropriate for a DBA interview but wholly unreasonable for a software developer interview (experienced or not).
      c) The first question is only relevant to a database vendor.
      d) The second question is not fair, because software developers typically don't deal with database performance logs; that is the job of the DBA.
    Perhaps some of you will be kind enough to comment on my assumptions or may have other suggestions before I write to the company.

    Read the article

  • How do web developers handle web design when freelancing?

    - by Gerald Blizz
    So I recently got my first job as a junior web developer. My company creates small and medium-sized sites for a wide variety of customers: automotive businesses, wedding agencies, sauna websites, etc.; hopefully you get my point. They don't do big, serious stuff like banking systems or other very large systems; it's mostly small/medium websites for startups and medium-sized businesses. My main skills are PHP/MySQL; I also know HTML and a bit of CSS/JS/AJAX. I know that a good web developer must know a backend language (like PHP/Ruby/Python) AND the HTML+CSS+JS+AJAX+jQuery combo. However, I have always wondered about this: in my company we have a web designer, and in other serious organisations I often see the same setup, web developers who create the business logic and web designers who create the design. As far as I know, after the designers produce the design of a website, they hand it to the developers either as a PSD or already sliced, and the developers put it together with the logic; the design is NOT created by the developers. Such a separation seems fine for a full-time job, but what I am wondering is how freelance web developers build websites. Do most of them just pay freelance designers to create the design for them? Or do some people do both? The reason I ask: I plan to start freelancing in my free time once I get good at web development, but I don't want to create websites with great business logic and poor design. Nor do I want to have someone else create the design for me. I like web development very much and I am doing quite well at it. I like design as well, even though I am a bit lost on how to study it and get better at it. But I am afraid that going in both directions won't let me become an expert; they seem like two totally different jobs, and getting really good at both seems very hard. But I really want to do both. What should I do? Thank you!

    Read the article

  • Using Deployment Manager

    - by Jess Nickson
    One of the teams at Red Gate has been working very hard on a new product: Deployment Manager. Deployment Manager is a free tool that lets you deploy updates to .NET apps, services and databases through a central dashboard. It has been out for a while, but I must admit that even though I work in the same building, until now I hadn't even looked at it.

    My job at Red Gate is to develop and maintain some of our community sites, which involves carrying out regular deployments. One of the projects I have to deploy on a fairly regular basis requires me to send my changes to our build server, TeamCity. The output is a Zip file of the build. I then have to go and find this file, copy it across to the staging machine, extract it, and copy some of the sub-folders to other places. In order to keep track of what builds are running, I need to rename the folders accordingly. However, even after all that, I still need to go and update the site and its applications in IIS to point at these new builds. Oh, and then I have to repeat the process when I deploy on production. Did I mention the multiple configuration files that then need updating as well? Manually? The whole process can take well over half an hour. I'm ready to try out a new process.

    Deployment Manager is designed to massively simplify deployment processes, from what could be lots of manual copying of files, managing of configuration files, and database upgrades, down to a few clicks. It's a big promise, but I decided to try out this new tool on one of the smaller ASP.NET sites at Red Gate, Format SQL (the result of a Red Gate Down Tools week). I wanted to add some new functionality, but given it was a new site with no set way of doing things, I was reluctant to have to manually copy files around servers. I decided to use this opportunity as a chance to set the site up on Deployment Manager and check out its functionality. What follows is a guide on how to get set up with Deployment Manager, a brief overview of its features, and what I thought of the experience.

    To follow along with the instructions below, you'll first need to download Deployment Manager from Red Gate. It has a free 'Starter Edition' which allows you to create up to 5 projects and agents (machines you deploy to), so it's really easy to get up and running with a fully-featured version.

    The Initial Set Up

    After installing the product and setting it up using the administration tool it provides, I launched Deployment Manager by going to the URL and port I had set it to run on. This loads up the main dashboard, which does a good job of guiding me through the process of getting started, beginning with a prompt to create some environments.

    1. Setting up Environments

    The dashboard informed me that I needed to add new 'Environments', which are essentially ways of grouping the machines you want to deploy to. The environments that get added will show up on the main dashboard. I set up two such environments for this project: 'staging' and 'live'.

    2. Add Target Machines

    Once I had created the environments, I was ready to add 'target machines' to them, which are the actual machines that the deployment will occur on. To enable me to deploy to a new machine, I needed to download and install an Agent on it. The 'Add target machine' form on the 'Environments' page helpfully provides a link for downloading an Agent. Once the agent has been installed, it is just a case of copying the server key to the agent, and the agent key to the server, to link them up.

    3. Run Health Check

    If, after adding your new target machine, the 'Status' flags an error, it is possible that the Agent and Server keys have not been entered correctly on both Deployment Manager and the Agent service. You can 'Check Health', which will give you more information on any issues. It is probably worth running this regardless of what status the 'Environments' dashboard is claiming, just to be on the safe side.

    4. Add Projects

    Going back to the main Dashboard tab at this point, I found that it was telling me that I needed to set up a new project. I clicked the 'project' link to get started, gave my new project a name and clicked 'Create'. I was then redirected to the 'Steps' page for the project, under the Projects tab.

    5. Package Steps

    The 'Steps' page was fairly empty when it first loaded. Adding a 'step' allowed me to specify what packages I wanted to grab for the deployment. This part requires a NuGet package feed to be set up, which is where Deployment Manager will look for the packages. At Red Gate, we already have one set up, so I just needed to tell Deployment Manager about it. Don't worry; there is a nice guide on how to go about all of this on the 'Package Feeds' page in 'Settings', if you need any help setting these bits up.

    At Red Gate we use a build server, TeamCity, which is capable of publishing built projects to the NuGet feed we use. This makes the workflow for Format SQL relatively simple: when I commit a change to the project, the build server is configured to grab those changes, build the project, and spit out a new NuGet package to the Red Gate NuGet package feed. My 'package step', therefore, is set up to look for this package on our feed. The final part of the package step was simply specifying which machines from which environments I wanted to be able to deploy the project to.

    Now the main Dashboard showed my new project and environment in a rather empty-looking grid, and clicking on my project presented me with a nice little message telling me that I am now ready to create my first release!

    Create a Release

    Next, I clicked on the 'Create release' button in the Projects tab. If your feeds and package step(s) were set up correctly, then Deployment Manager will automatically grab the latest version of the NuGet package that you want to deploy. It was able to pick up the latest build for Format SQL, and all I needed to do was enter a version number and description of the release. Underneath 'Version number', it keeps track of what version the previous release was given. Clicking 'Create' created the release and redirected me to a summary of it, where I could check the details before deploying.

    I clicked 'Deploy this release', chose the environment I wanted to deploy to, and... that's it. Deployment Manager went off and deployed it for me. Once I clicked 'Deploy release', Deployment Manager automatically updated and provided continuing feedback about the process. If any errors arise, I can expand the results to see where it went wrong. That's it, I'm done! Keep in mind that if you hit errors with the deployment itself, it is possible to view the log output to try and determine where they occurred; you can keep expanding the logs to narrow down the problem.

    Features

    One of the best bits of Deployment Manager for me is the ability to very, very easily deploy the same release to multiple machines. Deploying this same release to production was just a case of selecting the deployment and choosing the 'live' environment as the place to deploy to. Following on from this is the fact that, as Deployment Manager keeps track of all of your releases, it is extremely easy to roll back to a previous release if anything goes pear-shaped! You can view all your previous releases and select one to re-deploy. I needed this feature more than once when differences in my production and staging machines led to some odd behavior.

    Another option is to use the TeamCity integration available. This enables you to set Deployment Manager up so that it will automatically create releases and deploy these to an environment directly from TeamCity, meaning that you can always see the latest version up and running without having to do anything.

    Machine-Specific Deployments

    'What about custom configuration files?' I hear you shout. Certainly, it was one of my concerns. Our setup on the staging machine is not in line with that on production, which means that, should we deploy the same configuration to both, one of them is going to break. Thankfully, it turns out that Deployment Manager can deal with this. Given that I had environments 'staging' and 'live', and that staging used the project's web.config file while production ('live') required the config file to undergo some transformations, I simply added a web.live.config file to the project so that it would be included as part of the NuGet package. In this file, I wrote the XML document transformations I needed, and Deployment Manager took care of the rest.

    Another option is to set up 'variables' for your project, which allow you to specify key-value pairs for your configuration file, and which environment to apply them to. You'll find Variables as a full left-hand submenu within the 'Projects' tab. These features will definitely be of interest if you have a large number of environments!

    There are still many other features that I didn't get a chance to play around with, like running PowerShell scripts for more personalised deployments. Maybe next time! Also, let's not forget that my use case in this article is a very simple one - deploying a single package. I don't believe that all projects will be equally simple, but I already appreciate how much easier Deployment Manager could make my life. I look forward to the possibility of moving our other sites over to Deployment Manager in the near future.

    Conclusion

    In this article I have described the steps involved in setting up and configuring an instance of Deployment Manager, creating a new automated deployment process, and using this to actually carry out a deployment. I've tried to mention some of the features I found particularly useful, such as error logging, easy release management that allows you to deploy the same release multiple times, and configuration file transformations. If I had to point out one issue, it would be that releases are immutable, which from a development point of view makes sense. However, this causes confusion when I have to create a new release to deploy to a newly set up environment - I cannot simply deploy an old release onto a new environment; the whole release needs to be recreated.

    I really liked how easy it was to get going with the product. Setting up Format SQL and making a first deployment took very little time, especially when you compare it to how long it takes me to manually deploy the other site, as I described earlier. I liked how it let me know what I needed to do next, with little messages flagging up that I needed to 'create environments' or 'add some deployment steps' before I could continue. I found the dashboard incredibly convenient. As the number of projects and environments increases, it might become awkward to search them and find out what state they are in. Instead, the dashboard handily keeps track of the latest deployments of each project and lets you know what version is running on each of the environments, and when that deployment occurred. Finally, do you remember my complaint about having to rename folders so that I could keep track of what build they came from? This is yet another thing that Deployment Manager takes care of for you: each release is put into its own directory, which takes the name of whatever version number that release has, though these can be customised if necessary. If you'd like to take a look at Deployment Manager for yourself, you can download it here.

    Read the article

  • Automating deployments with the SQL Compare command line

    - by Jonathan Hickford
    In my previous article, "Five Tips to Get Your Organisation Releasing Software Frequently", I looked at how teams can automate processes to speed up release frequency. In this post, I'm looking specifically at automating deployments using the SQL Compare command line. SQL Compare compares SQL Server schemas and deploys the differences. It works very effectively in scenarios where only one deployment target is required - source and target databases are specified, compared, and a change script is automatically generated and applied. But if multiple targets exist, and pressure to increase the frequency of releases builds, this solution quickly becomes unwieldy.

    This is where SQL Compare's command line comes into its own. I've put together a PowerShell script that loops through the Servers table and pulls out the server and database; these are then passed to sqlcompare.exe to be used as target parameters. In the example, the source database is a scripts folder, a folder structure of scripted-out database objects used by both SQL Source Control and SQL Compare. The script can easily be adapted to use schema snapshots.

      -- Create a DeploymentTargets database and a Servers table
      CREATE DATABASE DeploymentTargets
      GO
      USE DeploymentTargets
      GO
      CREATE TABLE [dbo].[Servers](
          [id] [int] IDENTITY(1,1) NOT NULL,
          [serverName] [nvarchar](50) NULL,
          [environment] [nvarchar](50) NULL,
          [databaseName] [nvarchar](50) NULL,
          CONSTRAINT [PK_Servers] PRIMARY KEY CLUSTERED ([id] ASC)
      )
      GO

      -- Now insert your target server and database details
      INSERT INTO dbo.Servers ( serverName , environment , databaseName)
      VALUES ( N'myserverinstance' , N'myenvironment1' , N'mydb1')

      INSERT INTO dbo.Servers ( serverName , environment , databaseName)
      VALUES ( N'myserverinstance' , N'myenvironment2' , N'mydb2')

    Here's the PowerShell script, which you can adapt for yourself as well.

      # We're holding the server names and database names that we want to deploy to in a database table.
      # We need to connect to that server to read these details.
      $serverName = ""
      $databaseName = "DeploymentTargets"
      $authentication = "Integrated Security=SSPI"
      #$authentication = "User Id=xxx;PWD=xxx" # If you are using database authentication instead of Windows authentication.

      # Path to the scripts folder we want to deploy to the databases
      $scriptsPath = "SimpleTalk"

      # Path to SQLCompare.exe
      $SQLComparePath = "C:\Program Files (x86)\Red Gate\SQL Compare 10\sqlcompare.exe"

      # Create SQL connection string, and connection
      $ServerConnectionString = "Data Source=$serverName;Initial Catalog=$databaseName;$authentication"
      $ServerConnection = new-object system.data.SqlClient.SqlConnection($ServerConnectionString);

      # Create a Dataset to hold the DataTable
      $dataSet = new-object "System.Data.DataSet" "ServerList"

      # Create a query
      $query = "SET NOCOUNT ON;"
      $query += "SELECT serverName, environment, databaseName "
      $query += "FROM dbo.Servers; "

      # Create a DataAdapter to populate the DataSet with the results
      $dataAdapter = new-object "System.Data.SqlClient.SqlDataAdapter" ($query, $ServerConnection)
      $dataAdapter.Fill($dataSet) | Out-Null

      # Close the connection
      $ServerConnection.Close()

      # Populate the DataTable
      $dataTable = new-object "System.Data.DataTable" "Servers"
      $dataTable = $dataSet.Tables[0]

      # For every row in the DataTable
      $dataTable | FOREACH-OBJECT {
          "Server Name: $($_.serverName)"
          "Database Name: $($_.databaseName)"
          "Environment: $($_.environment)"

          # Compare the scripts folder to the database and synchronize the database to match.
          # NB. Have set SQL Compare to abort on medium level warnings.
          $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/AbortOnWarnings:Medium") # + @("/sync" ) # Commented out the 'sync' parameter for safety
          write-host $arguments
          & $SQLComparePath $arguments
          "Exit Code: $LASTEXITCODE"

          # Some interesting variations:

          # Check that every database matches a folder.
          # For example, this might be a pre-deployment step to validate everything is at the same baseline state,
          # or a post-deployment script to validate the deployment worked.
          # An exit code of 0 means the databases are identical.
          #
          # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/Assertidentical")

          # Generate a report of the difference between the folder and each database, and a SQL update script for each database.
          # For example, use this after the above to generate upgrade scripts for each database.
          # Examine the warnings and the HTML diff report to understand how the script will change objects.
          #
          # $arguments = @("/scripts1:$($scriptsPath)", "/server2:$($_.serverName)", "/database2:$($_.databaseName)", "/ScriptFile:update_$($_.environment+"_"+$_.databaseName).sql", "/report:update_$($_.environment+"_"+$_.databaseName).html", "/reportType:Interactive", "/showWarnings", "/include:Identical")
      }

    It's worth noting that the above example generates the deployment scripts dynamically. This approach should be problem-free for the vast majority of changes, but it is still good practice to review and test a pre-generated deployment script prior to deployment. An alternative approach would be to pre-generate a single deployment script using SQL Compare, and run this en masse against multiple targets programmatically using sqlcmd, or using a tool like SQL Multi Script. You can use the /ScriptFile, /report, and /showWarnings flags to generate change scripts, difference reports and any warnings; see the second commented-out variation in the PowerShell above.

    There is a drawback to running a pre-generated deployment script: it assumes that a given database target hasn't drifted from its expected state. Often there are (rightly or wrongly) many individuals within an organization who have permissions to alter the production database, and changes can therefore be made outside of the prescribed development processes. The consequence is that at deployment time, the applied script has been validated against a target that no longer represents reality. The solution here would be to add a check for drift prior to running the deployment script. This is achieved by using sqlcompare.exe to compare the target against the expected schema snapshot using the /Assertidentical flag. Should this return any differences (sqlcompare.exe exit code 79), a drift report is output instead of executing the deployment script; see the /Assertidentical commented-out variation above.

    Any checks and processes that should be undertaken prior to a manual deployment should also happen during an automated deployment. You might think about triggering backups prior to deployment - even better, automate the verification of the backup too.

    You can use SQL Compare's command line interface along with PowerShell to automate multiple actions and checks that you need in your deployment process. Automation is a practical solution where multiple targets and a higher release cadence come into play. As we know, with great power comes great responsibility - responsibility to ensure that the necessary checks are made so deployments remain trouble-free. (The code sample supplied in this post automates the simple dynamic deployment case - if you are considering more advanced automation, e.g. the drift checks, script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have further questions.)

    Read the article

  • Improving the speed of writing code in C#

    - by Robert Harvey
    Laugh if you want, but I used to develop substantial line-of-business applications in VB6, long before the .NET framework came along. Why, when I was your age, we used to walk two miles in the snow, uphill. Both ways... Love it or hate it, VB6 had a REPL-like feel, and a very rapid development cycle. I would like to know how to come closer to that process in C#. In VB6, I could write a function, execute it, debug it and have it fully functional in a few minutes. I am told this is how the Lisp crowd works. It's a very rapid-fire style of programming. In C# I write a function, then I write a unit test for that function (which is OK, I understand the value of that), then I right-click, run test, wait for the project to compile (takes about 10 seconds right now, which would be an eternity for a REPL loop), and get an exception. Honestly, this feels more like my junior college days, when I used to feed punch cards into a hopper and wait for a printout (exaggerating only slightly for effect). Additionally, my tendency nowadays is to make everything public while I'm testing it. Unit testing with private accessors works fine, but you can't trace through the code (unless, of course, I'm doing something wrong) while you're using them. So what I'd like to know is, what adjustments have you made to your development process in C# to streamline it, and make it possible to write and verify your code very rapidly?

    Read the article
