Search Results

Search found 22992 results on 920 pages for 'custom pages'.

  • Oracle Magazine, May/June 2009

    Oracle Magazine May/June features articles on Developer solutions, Oracle and Windows support for midsize businesses, application testing solutions, custom frameworks, ODP.NET transactions, managing literal values with PL/SQL, modernizing Oracle Forms, customizing Oracle Application Express, improving performance in Oracle Database 11g, Tom Kyte answering your questions and much more.

    Read the article

  • Teaching "web design/development" to high-school home-school group. Good sources?

    - by anonymous coward
    I may soon begin teaching a "web design and development" class for a home-school co-op group. Any suggestions for "course" material? My first thought was to work through the Opera Web Standards Curriculum, but am interested in hearing about possible alternatives or suggestions that better cover the "very basics" of getting started with designing and developing web pages. Not necessarily looking for topics, so much as existing resources. Thanks so much for your input!

    Read the article

  • property rental / availability & booking component for asp.net website [closed]

    - by Karl Cassar
    We have a website which contains various listings of properties. Some of these properties can be rented, and we would like to add a 'booking engine' to it to manage availability and bookings. However, I don't think it would be feasible to custom-code one just for this website. Is there any component/module one can integrate with to provide such functionality? The website is developed in C#/ASP.NET.

    Read the article

  • What is required for a scope in an injection framework?

    - by johncarl
    Working with libraries like Seam, Guice and Spring I have become accustomed to dealing with variables within a scope. These libraries give you a handful of scopes and allow you to define your own. This is a very handy pattern for dealing with variable lifecycles and dependency injection. I have been trying to identify where scoping is the proper solution, and where another solution is more appropriate (context variable, singleton, etc.). I have found that if the scope lifecycle is not well defined, managing injections this way is very difficult and often failure-prone. I have searched on this topic but have found little discussion of the pattern. Are there any good articles discussing where to use scoping and what the required or suggested prerequisites for scoping are? I'm interested both in references to existing discussion and in your view on what is required or suggested for a proper scope implementation. Keep in mind that I am referring to scoping as a general idea; this includes things like globally scoped singletons, request- or session-scoped web variables, conversation scopes, and others.

    Edit: Some simple background on custom scopes: Google Guice custom scope. Some definitions relevant to the above:

    “scoping” - A set of requirements that define what objects get injected at what time. A simple example of this is thread scope, based on a ThreadLocal: such a scope would inject a variable based on which thread instantiated the class. An example of this is sketched below, after the other definitions.

    “context variable” - A repository passed from one object to another holding relevant variables. Much like scoping, this is a more brute-force way of accessing variables based on the calling code. Example:

    methodOne(Context context){
        methodTwo(context);
    }

    methodTwo(Context context){
        ... //same context as method one, if called from method one
    }

    “globally scoped singleton” - Following the singleton pattern, there is one object per application instance. This applies to scopes because there is a basic lifecycle to this object: there is only one of these objects instantiated. Here's an example of a JSR-330 Singleton-scoped object:

    @Singleton
    public class SingletonExample{
        ...
    }

    usage:

    public class One {
        @Inject SingletonExample example1;
    }

    public class Two {
        @Inject SingletonExample example2;
    }

    After instantiation: one.example1 == two.example2 //true
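    Here is what the thread scope mentioned above could look like in Guice. This is a minimal sketch; the ThreadScope class and ThreadScoped annotation are illustrative names you would define yourself, not part of the Guice API:

    import java.lang.annotation.Retention;
    import java.lang.annotation.RetentionPolicy;

    import com.google.inject.Key;
    import com.google.inject.Provider;
    import com.google.inject.Scope;
    import com.google.inject.ScopeAnnotation;

    // Hypothetical annotation used to mark bindings as thread-scoped.
    @ScopeAnnotation
    @Retention(RetentionPolicy.RUNTIME)
    @interface ThreadScoped {}

    // A minimal thread scope: the first injection on a given thread creates
    // the instance; later injections on the same thread reuse it.
    public class ThreadScope implements Scope {
        public <T> Provider<T> scope(final Key<T> key, final Provider<T> unscoped) {
            // One cache per binding; each thread only ever sees its own entry.
            final ThreadLocal<T> cache = new ThreadLocal<T>();
            return new Provider<T>() {
                public T get() {
                    T instance = cache.get();
                    if (instance == null) {
                        instance = unscoped.get(); // delegate to the unscoped provider
                        cache.set(instance);
                    }
                    return instance;
                }
            };
        }
    }

    In a module you would register it with bindScope(ThreadScoped.class, new ThreadScope()); any binding annotated @ThreadScoped then gets one instance per thread, which is exactly the lifecycle the “scoping” definition above describes.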

    Read the article

  • how to localize new posts in asp.net...?? [closed]

    - by ntechi
    I am doing my final year project and have decided to make a website in ASP.NET, using Microsoft Visual Studio 2008. I'm making a real estate properties website. I want to know how to localize or create new posts in ASP.NET (like in WordPress), and when I hit SEARCH it should search for the desired keyword and return the matching posts. If posts are not possible then it should display pages...

    Read the article

  • How to Use Quick Toggles on Your Android Phone

    - by Chris Hoffman
    One of the big new features in Apple’s iOS 7 is Control Center, which allows you to quickly access and toggle common settings from anywhere. However, Android phones have had quick toggles for a long time. Android now has its own built-in quick toggles, while popular manufacturer-customized interfaces like Samsung’s TouchWiz have their own quick toggles, which work differently. You can also add custom quick toggles in different places.

    Read the article

  • Creating Ubuntu Live CD with customized Unity launcher

    - by toros
    I would like to create a custom Ubuntu image based on Natty using Ubuntu Customization Kit. I also want to customize the icons appearing on the Unity Launcher. I can change the icons on my desktop system with the following command: gsettings set com.canonical.Unity.Launcher favorites "['firefox.desktop', 'nautilus-home.desktop', 'libreoffice-writer.desktop']" I tried to run this command from the UCK console while creating the Live CD, but it doesn’t seem to work. Do you have any ideas how I could solve this?

    Read the article

  • Protect and Improve your Software with SmartAssembly 5

    - by Bart Read
    SmartAssembly 5 has been released. You can download a 14-day fully-functional free trial from: http://www.red-gate.com/products/smartassembly/index.htm This is the first major release since Red Gate acquired the tool last year, and our focus has mainly been on improving the quality of an already great tool. We've also simplified the licensing model so that there are now only three editions: Standard - bullet-proof protection at a bargain price, Pro - includes the SDK & custom web server...(read more)

    Read the article

  • Desktop Fun: Beaches Wallpaper Collection Series 2

    - by Asian Angel
    The sun is shining and the waves are gently rolling in as a light wind caresses the beach and all that resides there. Indulge in this classic vacation destination on your desktop with the second in our series of Beaches Wallpaper collections.

    Read the article

  • WordPress Plugins to Help Make Your Site Responsive

    - by Ravish
    Ultimate Coming Soon Page The Ultimate Coming Soon Page plugin allows you to quickly and easily set up a coming soon page for your website. It includes features such as complete customization of background color and image, custom CSS and HTML, email collection, and an option to stretch the background image to fit the browser. WP Orbit Slider [...]

    Read the article

  • Google Analytics - comparing metrics for different cities approach

    - by crmpicco
    I receive traffic from a number of different cities across the world: Washington, Bratislava and Belfast. In Google Analytics, I would like to be able to compare a variety of metrics side by side, but I'm not sure of the best way to go about this. Am I looking at creating 3 advanced segments, 3 profiles, or should I be doing it in one custom report? Or is this even possible in Google Analytics version 5?

    Read the article

  • The Best How-To Geek Articles for March 2012

    - by Asian Angel
    March was a busy month here at HTG where we covered topics such as properly scanning photos (and getting better images), the best tips for securing your data, identifying network abuse with Wireshark, and more. Join us as we look back at the most popular articles from this past month.

    Read the article

  • 8 Link Building Mistakes

    If you are running a website, you must know the importance of link building for a website or blog. A link here means a backlink: a link pointing to your website or web page from internal or external pages. An SEO guru has pointed out 8 link building mistakes to avoid when optimizing your site.

    Read the article

  • Avoiding Duplicate Content Penalties on a Corporate/Franchise website

    - by heath
    My question is really an extension of a previous question that was ported from stackoverflow and closed, so I cannot edit it. The basic gist is that a regional franchise company has decided to force all independent stores into one website look; they currently all have their own domains and completely different websites. After reading the helpful answers and looking over some links provided, I think my solution is to put a 301 on each franchise store site (acme-store1.com, acme-store2.com, etc.) back to the main corporate site (acme.com).

    All of the company history, product info, etc. (about 90% of the entire site) applies to all stores. However, each store should have some exclusive content such as staff, location pictures, exclusive events and promotions, etc. I originally thought that I would simply do something like acme.com/store1/staff, acme.com/store2/staff, etc. for the store-exclusive content, and then acme.com/our-company, for example, would cover all stores. However, I now see two issues that I don't know how to solve.

    1. They want to see site stats based on which store site the visitor came from. If a user comes from acme-store1.com, is redirected to acme.com and hits several pages, don't I need to somehow keep that original site in the new URL to track each page in that user's session and show they originally came from acme-store1.com?

    2. Each store is still independently owned and is essentially still in competition with the other stores, albeit in less competition than they are with other brands. This is important because each store would like THEIR contact info, links to their social media pages, their mailing list sign-up and customer requests on EVERY page. So if a user originally goes to acme-store1.com and is redirected to acme.com, it should still look to the user like it's all about store 1, even though 90% of the content will be exactly the same as it is on the store 2, store 3 and corporate sites. For example, acme.com/our-company would have the same company history and the same header/footer/navigation, BUT depending on the original site the user came from, it would display contact info and links for THAT store. If someone came directly to the corporate site, it would display the corporate contact and links (they have their own as well).

    I was considering that all redirects would be to store1.acme.com, store2.acme.com, etc. (or acme.com/store1), and then I could dynamically add the contact info and appropriate links based on the subdomain or subfolder. But then I have to worry about duplicate content penalties because, again, about 90% of the text in these "subdomains" is the same. For reference, this is a PHP5 site. I've already written a compact framework utilizing templates and mod_rewrite that I've used for other sites. Is this an easy fix that I'm just not grasping? Any suggestions?

    Read the article

  • Getting user generated content with no titles to rank

    - by hugo
    We are creating a site that allows users to generate content. The user is provided with a text field only (no title), similar to Twitter, Facebook, and Google+. Each piece of content created by the users will have a dedicated page/URL. Since the page has no title, I was wondering how search engines will index and display our pages. If the content is shared on other social networks, what will those results look like if there is no title for the Open Graph or Twitter tags?

    Read the article

  • Less Can Be More In E-Commerce

    - by Michael Hylton
    Today’s consumers are inundated with product choices and vendors. Visit your favorite electronics retailer and see the vast assortment of flat-panel televisions. Or the variety of detergents at the supermarket. All of this can be daunting for the average consumer who is looking for the products and services that interest them. In a study titled “Choice is Demotivating: Can One Desire Too Much of a Good Thing?”, the author, Sheena Iyengar, found that participants actually reported greater subsequent satisfaction with their selections and wrote better essays when their original set of options had been limited.

    The same can be said for e-commerce and your website. Being able to quickly convert shoppers into buyers with effective merchandising is what makes leading businesses successful. You want to engage each individual visitor with the most relevant content to drive higher conversions and order values while decreasing abandonment, but predicting what will resonate with each customer is difficult. In a world of choices, online merchandising tools can help personalize, streamline, and refine what your customers view when they browse your online catalog. The key to being effective is to align your products and content as closely as possible with the customer’s needs.

    The goal on the home page is to promote your brand and push visitors farther into the site. The home page is often the starting point for repeat customers as well as for new visitors hoping to address their current product needs. As the customer selects different filters and narrows the choices, valuable information is being provided to the retailer about the customer’s current need, regardless of previous search behavior or what other customers with a similar demographic profile have purchased. Together with search pages, category browse pages are among the primary options available to customers as a means of finding products on your site. Once a customer reaches the product detail page, it is clear what that person desires, regardless of the segment the customer falls into. However, don’t disregard campaign-based promotions completely. A campaign targeted to all customers but featuring rule-driven promotions tied to the product can be effective.

    Click here to learn more about merchandising techniques, so that what your customer sees is half full and not half empty.

    Read the article

  • Best Text-to-Speech Solution for my Website

    - by Tim Marshall
    I'm working on the 'Ease of Access' section of my website, with options to increase the font size displayed on pages, invert colours, and whatnot. I wish to implement a plugin which, if enabled by the user, reads the content on my website aloud. Presumably my best option is a website plugin; however, there might be some programming I've not come across which allows the likes of PHP to read content. I'm not entirely sure how this all works. Best Regards, Tim

    Read the article

  • Some Oracle VM 3 updates

    - by wcoekaer
    Today we did another patch set update for Oracle VM 3 (3.0.3-build 227). This can be downloaded from My Oracle Support as patch ID 14736185. There are quite a few updates in here and I highly recommend any Oracle VM 3 customer or user to install this update. This patch can be installed on top of Oracle VM 3.0 versions 3.0.2 and 3.0.3. The patch is cumulative for 3.0.3, so if you already installed patch update 1 (3.0.3-150) then this will just be incremental on top of that and brings you to 3.0.3-build 227. There is a readme file which contains the patchlist in the patch info.

    The following patches are released on ULN for Oracle VM Server 3.0:

    - initscripts-8.45.30-2.100.18.el5.x86_64 : The inittab file and the /etc/init.d scripts
    - kernel-ovs-2.6.32.21-45.6.x86_64 : The Linux kernel
    - kernel-ovs-firmware-2.6.32.21-45.6.x86_64 : Firmware files used by the Linux kernel
    - osc-oracle-ocfs2-0.1.0-35.el5.noarch : Oracle Storage Connect ocfs2 Plugin
    - osc-plugin-manager-1.2.8-9.el5.3.noarch : Oracle Storage Connect Plugin Infrastructure
    - osc-plugin-manager-devel-1.2.8-9.el5.3.noarch : Oracle Storage Connect Plugin Development
    - ovs-agent-3.0.3-41.6.x86_64 : Agent for Oracle VM
    - xen-4.0.0-81.el5.1.x86_64 : Xen is a virtual machine monitor
    - xen-devel-4.0.0-81.el5.1.x86_64 : Development libraries for Xen tools
    - xen-tools-4.0.0-81.el5.1.x86_64 : Various tooling for the manipulation of Xen instances

    Errata emails will be sent in the next few days with details on the above updates. Or you will find them here.

    I also did an update of my Oracle VM utilities to 0.4.0. They are also available from My Oracle Support, patch ID 14736239. These utils can be unzipped and installed on the server running Oracle VM Manager, typically in /u01/app/oracle/ovm-manager-3/ovm_utils. There is a set of man pages in /u01/app/oracle/ovm-manager-3/ovm_utils/man/man8. There now are 6 commands:

    - ovm_vmcontrol : VM level operations
    - ovm_servercontrol : server level operations
    - ovm_vmdisks : virtual disk/physical location mapping for VM disks
    - ovm_vmmessage : message passing utility between the manager and the VM tools (in the Oracle VM templates)
    - ovm_repocontrol : repository level operations
    - ovm_poolcontrol : pool level operations

    Some of the new changes:

    - at a pool level, acknowledge events and cascade to servers and virtual machines with outstanding events
    - at a pool level, do a rescan of the storage for fibrechannel/iscsi disks if you add new devices (it does this operation then on every running server)
    - at a repository level, fixup a device if it had a failed create repository
    - at a repository level, refresh the repository; this will update the free space in the UI for ocfs2 repositories
    - at a server level, acknowledge server events and cascade to virtual machines if needed
    - at a VM level, acknowledge VM events
    - at a VM level, bind vcpus to cores with vcpuset/vcpuget

    Please see the man pages and remember that these tools are just written As Is - no SRs... (per the documentation). Hopefully they are useful.

    Read the article

  • Using IIS Logs for Performance Testing with Visual Studio

    - by Tarun Arora
    In this blog post I’ll show you how you can play back the IIS logs in Visual Studio to automatically generate the web performance tests. You can also download the sample solution I am demoing in the blog post.

    Introduction

    Performance testing is as important for new websites as it is for evolving websites. If you already have your website running in production, you could mine the information available in the IIS logs to analyse the dense zones (most used pages) and performance test those pages, rather than wasting time testing and tuning the least used pages in your application.

    What are IIS Logs?

    To help with server use and analysis, IIS is integrated with several types of log files. These log file formats provide information on a range of websites and specific statistics, including Internet Protocol (IP) addresses, user information and site visits as well as dates, times and queries. If you are using IIS 7 and above you will find the log files in the following directory: C:\inetpub\Logs\

    Walkthrough

    1. Download and install Log Parser from the Microsoft Download Centre. You should see LogParser.dll in the install folder; the default install location is C:\Program Files (x86)\Log Parser 2.2. LogParser.dll gives us a library to query the IIS log files programmatically. By the way, if you haven’t used Log Parser in the past, it is a powerful, versatile tool that provides universal query access to text-based data such as log files, XML files and CSV files, as well as key data sources on the Windows operating system such as the Event Log, the Registry, the file system, and Active Directory. More details…

    2. Create a new test project in Visual Studio. Let’s call it IisLogsToWebPerfTest.

    3. Delete the UnitTest1.cs class that gets created by default. Right click the solution and add a project of type class library, named IisLogsToWebPerfTestEngine. Delete the default class Program.cs that gets created with the project.

    4. Under the IisLogsToWebPerfTestEngine project add references to:
    - Microsoft.VisualStudio.QualityTools.WebTestFramework - c:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\PublicAssemblies\Microsoft.VisualStudio.QualityTools.WebTestFramework.dll
    - LogParser, also called MSUtil - c:\users\tarora\documents\visual studio 2010\Projects\IisLogsToWebPerfTest\IisLogsToWebPerfTestEngine\obj\Debug\Interop.MSUtil.dll

    5. Right click the IisLogsToWebPerfTestEngine project and add a new class, IISLogReader.cs. The IISLogReader class queries the IIS logs using Log Parser:

    using System;
    using System.Collections.Generic;
    using System.Text;
    using MSUtil;
    using LogQuery = MSUtil.LogQueryClassClass;
    using IISLogInputFormat = MSUtil.COMIISW3CInputContextClassClass;
    using LogRecordSet = MSUtil.ILogRecordset;
    using Microsoft.VisualStudio.TestTools.WebTesting;
    using System.Diagnostics;

    namespace IisLogsToWebPerfTestEngine
    {
        // By making use of Log Parser it is possible to query the IIS log using SELECT queries
        public class IISLogReader
        {
            private string _iisLogPath;

            public IISLogReader(string iisLogPath)
            {
                _iisLogPath = iisLogPath;
            }

            public IEnumerable<WebTestRequest> GetRequests()
            {
                LogQuery logQuery = new LogQuery();
                IISLogInputFormat iisInputFormat = new IISLogInputFormat();

                // currently these columns give us sufficient information to construct the web test requests
                string query = @"SELECT s-ip, s-port, cs-method, cs-uri-stem, cs-uri-query FROM " + _iisLogPath;
                LogRecordSet recordSet = logQuery.Execute(query, iisInputFormat);

                // Apply a bit of transformation
                while (!recordSet.atEnd())
                {
                    ILogRecord record = recordSet.getRecord();
                    if (record.getValueEx("cs-method").ToString() == "GET")
                    {
                        string server = record.getValueEx("s-ip").ToString();
                        string path = record.getValueEx("cs-uri-stem").ToString();
                        string querystring = record.getValueEx("cs-uri-query").ToString();
                        StringBuilder urlBuilder = new StringBuilder();
                        urlBuilder.Append("http://");
                        urlBuilder.Append(server);
                        urlBuilder.Append(path);
                        if (!String.IsNullOrEmpty(querystring))
                        {
                            urlBuilder.Append("?");
                            urlBuilder.Append(querystring);
                        }

                        // You could make substitutions by introducing parameterized web tests.
                        WebTestRequest request = new WebTestRequest(urlBuilder.ToString());
                        Debug.WriteLine(request.UrlWithQueryString);
                        yield return request;
                    }
                    recordSet.moveNext();
                }

                Console.WriteLine(" That's it! Closing the reader");
                recordSet.close();
            }
        }
    }

    6. Connect the dots by adding a project reference to IisLogsToWebPerfTestEngine from IisLogsToWebPerfTest. Right click the IisLogsToWebPerfTest project and add a new class, WebTest1Coded.cs. The WebTest1Coded class inherits from the WebTest class. By overriding the GetRequestEnumerator method we can pass the log files to the IISLogReader class, which uses Log Parser to query the log file, extract the web requests and generate the web test requests, which are yielded back for playback when the test is run.

    namespace IisLogsToWebPerfTest
    {
        using System;
        using System.Collections.Generic;
        using System.Text;
        using Microsoft.VisualStudio.TestTools.WebTesting;
        using Microsoft.VisualStudio.TestTools.WebTesting.Rules;
        using IisLogsToWebPerfTestEngine;

        // This class is a coded web performance test implementation that simply passes
        // the path of the IIS logs to the IISLogReader class, which does the heavy
        // lifting of reading the contents of the log file and converting them to tests.
        // You could have multiple such classes that inherit from WebTest, implement the
        // GetRequestEnumerator method, and pass different log files for different tests.
        public class WebTest1Coded : WebTest
        {
            public WebTest1Coded()
            {
                this.PreAuthenticate = true;
            }

            public override IEnumerator<WebTestRequest> GetRequestEnumerator()
            {
                // substitute the highlighted path with the path of the IIS log file
                IISLogReader reader = new IISLogReader(@"C:\Demo\iisLog1.log");
                foreach (WebTestRequest request in reader.GetRequests())
                {
                    yield return request;
                }
            }
        }
    }

    7. It’s time to fire off the test and see the IIS log play back as a web performance test. From the Test menu choose the Test View window; you should be able to see the WebTest1Coded test show up. Highlight the test and press Run Selection (you can also debug the test in case you face any failures during test execution).

    8. Optionally you can create a Load Test, keeping WebTest1Coded as the base test.

    Conclusion

    You have just helped your testing team; you have now become the coolest developer in your organization! Jokes apart, Log Parser and web performance tests together allow you to save a lot of time by not having to worry about what to test or how to record the test. If you haven’t already, download the solution from here. You can take this to the next level by using Log Parser to extract the log files as part of an end-of-day batch to a database. Track the usage trends by user with this solution over a longer term, and have your tests consume the web requests now stored in the database to generate the web performance tests. If you like the post, don’t forget to share… Keep RocKiNg!

    Read the article

  • TuxOnIce-problem: "CPU stuck for 22 s"

    - by Lester
    I use Ubuntu 12.04 LTS on a Lenovo ThinkPad T61 and am new to Linux. I installed TuxOnIce following this guide. It actually works, but resuming takes very long. Shortly before resuming completes I see a message flash by on the command line like "CPU stuck for 22 s". Some googling brought up pages like this, but it did not help me solve my problem. I suppose being absolutely new to Linux is the biggest part of the problem.

    Read the article

  • SQL Server Developer Tools – Codename Juneau vs. Red-Gate SQL Source Control

    - by Ajarn Mark Caldwell
    So how do the new SQL Server Developer Tools (previously code-named Juneau) stack up against SQL Source Control? Read on to find out.

    At the PASS Community Summit a couple of weeks ago, it was announced that the previously code-named Juneau software would be released under the name of SQL Server Developer Tools with the release of SQL Server 2012. This replacement for Database Projects in Visual Studio (also known in a former life as Data Dude) has some great new features. I won’t attempt to describe them all here, but I will applaud Microsoft for making major improvements. One of my favorite changes is the way database elements are broken down. Previously every little thing was in its own file. For example, indexes were each in their own file. I always hated that. Now, SSDT uses a pattern similar to Red Gate’s and puts the indexes and keys into the same file as the overall table definition.

    Of course there are really cool features to keep your database model in sync with the actual source scripts, and the rename refactoring feature is now touted as being more than just a search and replace, but rather a “semantic-aware” search and replace. Funny, it reminds me of SQL Prompt’s Smart Rename feature. But I’m not writing this just to criticize Microsoft and argue that they are late to the party with this feature set. Instead, I do see it as a viable alternative for folks who want all of their source code to be version controlled, but there are a couple of key trade-offs that you need to know about when you choose which tool set to use.

    First, the basics

    Both tool sets integrate with a wide variety of source control systems including the most popular: Subversion, Git, Vault, and Team Foundation Server. Both tools have integrated functionality to produce objects to upgrade your target database when you are ready (DACPACs in SSDT, integration with SQL Compare for SQL Source Control). If you regularly live in Visual Studio or the Business Intelligence Development Studio (BIDS) then SSDT will likely be comfortable for you. Like BIDS, SSDT is a Visual Studio project type that comes with SQL Server, and if you don’t already have Visual Studio installed, it will install the shell for you. If you already have Visual Studio 2010 installed, then it will just add this as an available project type. On the other hand, if you regularly live in SQL Server Management Studio (SSMS) then you will really enjoy the SQL Source Control integration from within SSMS. Both tool sets store their database model in script files. In SSDT, these are on your file system like other source files; in SQL Source Control, these are stored in the folder structure in your source control system, and you can always GET them to your file system if you want to browse them directly.

    For me, the key differentiating factors are 1) a single, unified check-in, and 2) migration scripts. How you value those two features will likely make your decision for you.

    Unified Check-In

    If you do a continuous-integration (CI) style of development that triggers an automated build with unit testing on every check-in of source code, and you use Visual Studio for the rest of your development, then you will want to really consider SSDT. Because it is just another project in Visual Studio, it can be added to your existing solution, and you can then do a complete, unified, single check-in of all changes, whether they are application or database changes.

    This is simply not possible with SQL Source Control, because it is in a different development tool (SSMS instead of Visual Studio) and there is no way to do one unified check-in between the two. You CAN do really fast back-to-back check-ins, but there is the possibility that the automated build that is triggered from the first check-in will cause your unit tests to fail and the CI tool to report that you broke the build. Of course, the automated build that is triggered from the second check-in, which contains the “other half” of your changes, should pass, and so the amount of time that the build was broken may be very, very short; but if that is very, very important to you, then SQL Source Control just won’t work; you’ll have to use SSDT.

    Refactoring and Migrations

    If you work on a mature system, or on a not-so-mature but also not-so-well-designed system, where you want to refactor the database schema as you go along but you can’t have data suddenly disappearing from your target system, then you’ll probably want to go with SQL Source Control. As I wrote previously, there are a number of changes which you can make to your database that the comparison tools (both from Microsoft and Red Gate) simply cannot handle without the possibility (or probability) of data loss. Currently, SSDT only offers you the ability to inject PRE and POST custom deployment scripts. There is no way to insert your own script in the middle to override the default behavior of the tool. In version 3.0 of SQL Source Control (an Early Access version is now available) you have the ability to create your own custom migration script to take the place of the commands that the tool would have run, and ensure the preservation of your data. Or, even if the default tool behavior would have worked, if you simply know a better way, then you can take control and do things your way instead of theirs.

    You Decide

    In the environment I work in, our automated builds are not triggered off of check-ins but off of the clock (currently once per night), and so there is no point at which the automated build and unit tests will be triggered without having both sides of the development effort already checked in. Therefore having a unified check-in, while handy, is not critical for us. As for migration scripts, these are critically important to us. We do a lot of new development on systems that have already been in production for years, and it is not uncommon for us to need to do a refactoring of the database. Because of the maturity of the existing system, that often involves data migrations or other additional SQL tasks that the comparison tools just can’t detect on their own. Therefore, the ability to create a custom migration script to override the tool’s default behavior is very important to us. And so, you can see why we will continue to use Red Gate SQL Source Control for the foreseeable future.

    Read the article

  • Multiple domain links on Google from one WordPress site

    - by user557318
    At present, when I Google the domain name of the WordPress sites I have worked on, I receive at least three listings (often the top three). The first listing is the only one I am interested in seeing; the others come from individual pages of that WordPress site, i.e.:

    1st hit - www.domain.com
    2nd hit - www.domain.com/about
    3rd hit - www.domain.com/designers

    Does anybody know if it's possible to remove all the links but the absolute www.domain.com?

    Read the article

  • Never update systems tables directly - a study in Agent job scheduling

    It is often recommended that system tables should not be updated directly. This article presents a case in point, built around nightly job configuration, to demonstrate the possible issues with updating system tables directly.

    Read the article
