Search Results

Search found 11862 results on 475 pages for 'gman save the unicorns'.


  • SEO: Moving articles from one domain to another

    - by Melanie
Currently I have articles up on a website (site A) that is not mine (but I can remove the articles). The articles aren't faring well, not only due to the recent Google changes, but because they really could do better if I made some tweaks myself instead of relying on the domain owner's SEO skills. So I would like to set up my own website (site B) and have just my articles on it. In the past when I've moved content, I've set up redirects, but this time I can't do that. What would be the best way to move the articles without having to worry about them being counted as duplicate content or any other lame stuff? Should I: A) Save the articles on my computer, remove them from site A, and wait for Google to remove them from the index (several months), or B) Remove the articles from site A and immediately place them on site B?

    Read the article

  • PonyEdit: It’s really fast

    - by Gary Pendergast
Over the past few months, a friend and I have been hard at work on a new breed of text editor that we call PonyEdit. If you've ever found yourself cursing over the lag of working on remote cloud servers, this is the editor for you. It's not just another SFTP editor… Reading and writing files over SFTP is nothing new; dozens of text editors can do it. But it's always slow, clunky, and feels like the feature was bolted on as an afterthought. You'll find yourself using separate shortcuts to open files locally vs. remotely, and dealing with sometimes painful save times with every edit, no matter how minor. PonyEdit gets rid of this terribly slow way of working by connecting over SSH and using edit streaming to push changes to the server in the background as you type. Head on over to PonyEdit.com to download a free trial, and let me know what you think! Oh, and… stand by to have your mind blown.

    Read the article

  • Hack Extension Files to Make Them Version-Compatible for Firefox

    - by Asian Angel
A well known drawback of using Firefox is the problem with extension compatibility when a new major version is released. Whether it is a new extension that you are trying for the first time or an old favorite, we have a way to get those extensions working for you again. There are multiple reasons why you might want to choose this method to fix a non-compatible extension: You are uncomfortable with tweaking the "about:config" settings. You prefer to maintain the original "about:config" settings in a pristine state and like having compatibility checking active. You are looking to gain some "geek cred". Keep in mind that most extensions will work perfectly well with a new version of Firefox and simply have the "version compatibility number" problem. But once in a while there may be one that needs to have some work done on it by the extension's author.

The Problem
Here is a perfect example of everyone's least favorite "extension message". This is the last thing that you need when all that you want is for your favorite extension (or a new one) to work on a fresh clean install. Note: This works nicely to "replace" non-compatible extensions already present in your browser if you are simply upgrading.

Hacking the XPI File
For this procedure you will need to manually download the extension to your hard drive (right click on the extension's "Install" button and select "Save As"). Once you have done that you are ready to start hacking the extension. For our example we chose the "GCal Popup Extension". The best thing to do is place the extension in a new folder (e.g. the Desktop or another convenient location), then unzip it just the same way that you would with any regular zip file. Once it is unzipped you will see the various folders and files that were in the "xpi file" (we had four files here, but depending on the extension the number may vary). There is only one file that you need to focus on… the "install.rdf" file. Note: At this point you should move the original extension file to a different location (i.e. outside of the folder) so that it is no longer present. Open the file in Notepad so that you can change the number for the "maxVersion". Here the number is listed as "3.5.*" but we needed to make it higher… Replacing the "5" with a "7" is all that we needed to do. Once you have entered your new "maxVersion" number, save the file. At this point you will need to re-zip all of the files back into a single file. Make certain that you create a file with the ".zip" file extension, otherwise this will not work. Once you have the new zip file created you will need to rename the entire file, including the file extension. For our example we copied and pasted the original extension name. Once you have changed the name, click outside of the text area. You will see a small message window like this asking for confirmation… click "Yes" to finish the process. Now your modified/updated extension is ready to install. Drag the extension into your browser to install it and watch that wonderful "Restart to complete the installation." message appear. As soon as your browser starts you can check the Add-ons Manager window and see the version compatibility numbers for the extension. Looking very very nice! And just like that, your extension should be up and running without any problems.
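To make the "maxVersion" edit concrete, here is roughly what the relevant fragment of an install.rdf looks like before the change (trimmed for brevity; exact contents vary from extension to extension, but {ec8030f7-…} is Firefox's application id):

<em:targetApplication>
  <Description>
    <em:id>{ec8030f7-c20a-464f-9b0e-13a3a9e97384}</em:id>
    <em:minVersion>1.5</em:minVersion>
    <em:maxVersion>3.5.*</em:maxVersion> <!-- bump this to 3.7.* (or whatever the new Firefox version is) -->
  </Description>
</em:targetApplication>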
Conclusion
If you are looking to try something new, gain some geek cred, or just want to keep your Firefox install as close to the original condition as possible, this method should get those extensions working nicely for you again.

    Read the article

  • Edit Media Center TV Recordings with Windows Live Movie Maker

    - by DigitalGeekery
Have you ever wanted to take a TV program you've recorded in Media Center and remove the commercials or save clips of favorite scenes? Today we'll take a look at editing WTV and DVR-MS files with Windows Live Movie Maker (WLMM). Download and install Windows Live Movie Maker; the download link can be found at the end of the article. WLMM is part of Windows Live Essentials, but you can choose to install only the applications you want. You'll also want to be sure to uncheck any unwanted settings, like setting Bing as your default search provider or MSN as your browser home page. Add your recorded TV file to WLMM by clicking the Add videos and photos button, or by dragging and dropping it onto the storyboard. You'll see your video displayed in the Preview window on the left and on the storyboard. Adjust the Zoom Time Scale slider at the lower right to change the level of detail displayed on the storyboard. You may want to start zoomed out and zoom in for more detailed edits.

Removing Commercials or Unwanted Sections
Note: Changes and edits made in Windows Live Movie Maker do not change or affect the original video file. To accomplish this, we will make cuts, or "splits," at the beginning and end of the section we want to remove, and then we will delete that section from our project. Click and drag the slider bar along the storyboard to scroll through the video. When you get to the end of a row on the storyboard, drag the slider down to the beginning of the next row. We've found it easiest and most accurate to get close to the end of the commercial break and then use the Play button and the Previous Frame and Next Frame buttons underneath the Preview window to fine tune your cut point. When you find the right place to make your first cut, click the Split button on the Edit tab on the ribbon. You will see your video "split" into two sections. Now, repeat the process of scrolling through the storyboard to find the end of the section you wish to cut. When you are at the proper point, click the Split button again. Now we'll delete that section by selecting it and pressing the Delete key, selecting Remove on the Home tab, or by right clicking on the section and selecting Remove.

Trim Tool
This tool allows you to select a portion of the video to keep while trimming away the rest. Click and drag the sliders in the preview window to select the area you want to keep. The area outside the sliders will be trimmed away; the area inside is the section that is kept in the movie. You can also adjust the Start and End points manually on the ribbon. Delete any additional clips you don't want in the final output. You can also accomplish this by using the Set start point and Set end point buttons. Clicking Set start point will eliminate everything before the start point; Set end point will eliminate everything after the end point. And you're left with only the clip you want to keep.

Output your Video
Select the icon at the top left, then select Save movie. All of these settings will output your movie as a WMV file, but file size and quality will vary by setting. The Burn to DVD option also outputs a WMV file, but then opens Windows DVD Maker and prompts you to create and burn a DVD.

Conclusion
WLMM is one of the few applications that can edit WTV files, and it's the only one we're aware of that's free.
We should note that only WTV and DVR-MS files will appear in the Recorded TV library in Media Center, so if you want to view your WMV output file in WMC you'll need to add it to the Video or Movie library. Would you like to learn more about Windows Live Movie Maker? Check out our article on how to turn photos and home videos into movies with Windows Live Movie Maker. Need to add videos from a network location? WLMM doesn't allow this by default, but you can check out how to add network support to Windows Live Movie Maker. Download Windows Live

    Read the article

  • Sed problem in a Bash script

    - by moata_u
Hello there. I'm having a problem using the sed command. I'm trying to write a Bash script that does the following: search for the line that contains :@, then save that line and replace it with a new line, as in the following:

#!/bin/bash
echo "Please enter the ip address of you file"
read ipnumber
find=`grep ':@' application.properties`                            # find the line
input="connection.url=jdbc\:oracle\:thin\:@$ipnumber\:1521\:billz" # preparing new line
echo `sed "s/'${find}'/'${input}'/g" application.properties`       # replace old with new line

The problem is: nothing happens. I've already tried using "${find}" instead of '${find}'.
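For reference, a minimal working sketch of what the script seems to be aiming for. Two likely problems with the original: the single quotes inside the double-quoted sed expression become literal characters (so the pattern never matches anything in the file), and echoing the output of sed never writes anything back to the file (sed needs -i, or a redirect, to modify it). Assuming the goal is to replace the whole matched line:

#!/bin/bash
read -p "Please enter the ip address of your file: " ipnumber
input="connection.url=jdbc\\:oracle\\:thin\\:@${ipnumber}\\:1521\\:billz"
# Escape the characters that are special in a sed replacement (\ & and the | delimiter)
escaped=$(printf '%s' "$input" | sed 's/[\\&|]/\\&/g')
# Replace any line containing ':@' with the new line, editing the file in place
sed -i "s|.*:@.*|${escaped}|" application.properties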

    Read the article

  • Using the Windows Explorer Context Menu to reset Umbraco Directory Permissions

    - by Vizioz Limited
Hi All, As Umbraco matures I am assuming that needing to reset directory permissions might well become a thing of the past, but at the moment it is still something I often find myself doing when I copy sites between machines. As it's 4:30am I thought, there must be a better way than having to open up a DOS prompt, navigate to a directory and then run a batch file, passing in the IIS root folder location. Well… there is :) I googled for adding a command to the context menu within Windows Explorer. I found a way of doing this for XP, but it seems the functionality was removed from Windows 7; however, I found a very neat freeware application called File Menu Tools which does work perfectly! I have now added a command to my context menu that enables me to right click an IIS site root folder and then call my batch script, automatically passing in the directory. This will save me a bunch of time :)
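For anyone curious what such a permissions-reset script might look like, here is a rough sketch. The folder list and the IIS_IUSRS account are assumptions on my part; check the Umbraco documentation for the exact directories your version needs:

@echo off
rem %1 is the IIS site root folder that File Menu Tools passes in
set SITEROOT=%~1
rem Grant the worker process identity modify rights on the folders Umbraco writes to
for %%D in (app_data config css masterpages media scripts umbraco usercontrols xslt) do (
    icacls "%SITEROOT%\%%D" /grant "IIS_IUSRS:(OI)(CI)M" /T
)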

    Read the article

  • Address Regulatory Mandates for Data Encryption Without Changing Your Applications

    - by Troy Kitch
The Payment Card Industry Data Security Standard, US state-level data breach laws, and numerous data privacy regulations worldwide all call for data encryption to protect personally identifiable information (PII). However, encrypting PII in applications requires costly and complex application changes. Fortunately, since this data typically resides in the application database, PII can be encrypted transparently by the Oracle database using Oracle Advanced Security, without any application changes. In this ISACA webinar, learn how Oracle Advanced Security offers complete encryption for data at rest, in transit, and on backups, along with built-in key management, to help organizations meet regulatory requirements and save money. You will also hear from TransUnion Interactive, the consumer subsidiary of TransUnion, a global leader in credit and information management which maintains credit histories on an estimated 500 million consumers across the globe, about how they addressed PCI DSS encryption requirements using Oracle Database 11g with Oracle Advanced Security. Register to watch the webinar now.
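To illustrate the "no application changes" point: with Oracle Advanced Security, transparent data encryption is switched on with DDL alone, so application SQL is untouched. A minimal sketch (table, column and file names here are hypothetical, and the encryption wallet must already be configured and open):

-- Encrypt an existing PII column in place; applications keep querying it as before
ALTER TABLE customers MODIFY (ssn ENCRYPT USING 'AES256');

-- Or, in Oracle Database 11g, encrypt everything placed in a tablespace at creation time
CREATE TABLESPACE secure_ts
  DATAFILE 'secure_ts01.dbf' SIZE 100M
  ENCRYPTION USING 'AES256'
  DEFAULT STORAGE (ENCRYPT);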

    Read the article

  • Web-based CMS for mobile app

    - by JWood
I'm just about to start developing a mobile app which needs to be fed from a CMS. I started designing the tables when I thought there must be something out there that could save me a load of time and let me concentrate on the mobile side of things. So, I'm looking for a CMS that will let me create hierarchical "pages" which will just be 4-5 database fields, with a simple front-end to allow me to edit and update them. I don't mind having to write some code to lay out the database and forms etc.; any saving on starting from scratch would be good. The only requirement is that I be able to access the data via some sort of web service: REST, JSON, XML, anything really… Can anyone suggest anything that might help? Thanks, J

    Read the article

  • Grontmij|Carl Bro A/S Relies on Telerik Reporting for Data Presentation and Analysis of Critical Bus

Grontmij | Carl Bro A/S, an international company providing consultancy services in the fields of building, transportation, water, environment, energy and industry, is using Telerik Reporting to save coding time and build an expandable solution with swift performance and a rich user interface. The main objective was to design and develop a web application that would provide users with an overview of construction budgets, contacts and all documents related to the properties and buildings they managed...

    Read the article

  • How to Automate your Database Documentation

    - by Jonathan Hickford
In my previous post, "Automating Deployments with SQL Compare command line", I looked at how teams can automate the deployment and post-deployment validation of SQL Server databases using the command line versions of Red Gate tools. In this post I'm looking at another use for the command line tools, namely using them to generate up-to-date documentation with every database change. There are many reasons why up-to-date documentation is valuable, for example when somebody new has to work on or administer a database for the first time, or when a new database comes into service. Having database documentation reduces the risk of making incorrect decisions when making changes. Documentation is very useful to business intelligence analysts when writing reports, for example in SSRS. There are a couple of great examples talking about why up-to-date documentation is valuable on this site: Database Documentation – Lands of Trolls: Why and How? and Database Documentation Using SQL Doc. The short answer is that it can save you time and reduce risk when you need that most! SQL Doc is a fast, simple tool that automatically generates database documentation. It can create documents as HTML, Word or PDF files. The documentation contains information about object definitions and dependencies, along with any other information you want to associate with each object. The SQL Doc GUI, which is included in Red Gate's SQL Developer Bundle and SQL Toolbelt, allows you to add additional notes to objects and customise which objects are shown in the docs. These settings can be saved as a .sqldoc project file. The SQL Doc command line can use this project file to automatically update the documentation every time the database is changed, ensuring documentation that is always up to date. The simplest way to keep documentation up to date is probably to use a scheduled task to run a script every day. However, if you have a source controlled database, or are using a Continuous Integration (CI) server or a build server, it may make more sense to use that instead. If you're using SQL Source Control or SSDT Database Projects to help version control your database, you can automatically update the documentation after each change is made to the source control repository that contains your database. To get this automation in place, you can use the functionality of a Continuous Integration (CI) server, which can trigger commands to run when a source control repository has changed. A CI server will also capture and save the documentation that is created as an artifact, so you can always find the exact documentation for a specific version of the database. This forms an always up-to-date data dictionary. If you don't already have a CI server in place there are several you can use, such as the free open source Jenkins or the free starter editions of TeamCity. I won't cover setting these up in this article, but there is information about using CI servers for automating database tasks on the Red Gate Database Delivery webpage. You may be interested in Red Gate's SQL CI utility (part of the SQL Automation Pack), which is an easy way to update a database with the latest changes from source control. The PowerShell example below shows how to create the documentation from a database. That database might be your integration database or a shared development database that is always up to date with the latest changes.
$serverName = "server\instance"
$databaseName = "databaseName" # If you want to document multiple databases use a comma separated list
$userName = "username"
$password = "password"
# Path to SQLDoc.exe
$SQLDocPath = "C:\Program Files (x86)\Red Gate\SQL Doc 3\SQLDoc.exe"
$arguments = @(
    "/server:$($serverName)",
    "/database:$($databaseName)",
    "/username:$($userName)",
    "/password:$($password)",
    "/filetype:html",
    "/outputfolder:.",
    # "/project:$args[0]", # If you already have a .sqldoc project file you can pass it as an argument to this script. Values in the project will be overridden with any options set on the command line
    "/name:$databaseName Report",
    "/copyrightauthor:$([Environment]::UserName)"
)
write-host $arguments
& $SQLDocPath $arguments

There are several options you can set on the command line to vary how your documentation is created. For example, you can document multiple databases or exclude certain types of objects. In the example above, we set the name of the report to match the database name, and use the current Windows user as the documentation author. For more examples of how you can customise the report from the command line, please see the SQL Doc command line documentation. If you already have a .sqldoc project file, or wish to further customise the report by including or excluding specific objects, you can use this project on the command line. Any settings you specify on the command line will override the defaults in the project. For details of what you can customise in the project please see the SQL Doc project documentation. In the example above, the line to use a project is commented out, but you can uncomment this line and then pass a path to a .sqldoc project file as an argument to this script.

Conclusion
Keeping documentation about your databases up to date is very easy to set up using SQL Doc and PowerShell. By using a CI server to run this process you can trigger the documentation to be run on every change to a source controlled database, and keep historic documentation available. If you are considering more advanced database automation, e.g. database unit testing, change script generation, deploying to large numbers of targets and backup/verification, please email me at [email protected] for further script samples or if you have any questions.

    Read the article

  • Week in Geek: Google Asks for Kids’ Social Security Numbers Edition

    - by Asian Angel
This week we learned how to make hundreds of complex photo edits in seconds with Photoshop actions, use an Android phone as a modem with no rooting required, install a wireless card in Linux using Windows drivers, change Ubuntu's window borders with Emerald, how noise reducing headphones work, and more. Photo by Julian Fong.

    Read the article

  • Which SSL do I need?

    - by Maik Klein
I need to buy an SSL certificate. There are so many different alternatives with a huge price range. I know the very basic differences in browser compatibility and security level, but I need a "cheap" SSL certificate. My homepage looks like this: http://www.test.com. Now if I go to the login page I should switch to HTTPS, like this: https://www.test.com/login. I am also considering securing the whole site once the user has signed in. Now there are sites offering SSL for $7/year. Would this do the job? Or would you recommend I get something more expensive, like this one? I want to add PayPal support in a later version of my website, and I don't want to save money on the wrong end. What would you recommend?

    Read the article

  • Simple Excel Export with EPPlus

    - by Jesse Taber
Originally posted on: http://geekswithblogs.net/GruffCode/archive/2013/10/30/simple-excel-export-with-epplus.aspx

Anyone I've ever met who works with an application that sits in front of a lot of data loves it when they can get that data exported to an Excel file to mess around with offline. As both developer and end user of a little website project that I've been working on, I found myself wanting to be able to get a bunch of the data that the application was collecting into an Excel file. The great thing about being both an end user and a developer on a project is that you can build the features that you really want! While putting this feature together I came across the fantastic EPPlus library. This library is certainly very well known and popular, but I was so impressed with it that I thought it was worth a quick blog post. This library is extremely powerful; it lets you create and manipulate Excel 2007/2010 spreadsheets in .NET code with a high degree of flexibility. My only gripe with the project is that they are not touting how insanely easy it is to build a basic Excel workbook from a simple data source. If I were running this project, the approach I'm about to demonstrate in this post would be front and center on the landing page, because it shows how easy it really is to get started and serves as a good way to ease yourself into some of the more advanced features. The website in question uses RavenDB, which means that we're dealing with POCOs to model the data throughout all layers of the application. I love working like this, so when it came time to figure out how to export some of this data to an Excel spreadsheet I wanted to find a way to take an IEnumerable<T> and just have it dumped to Excel, with each item in the collection modeled as a single row in the Excel worksheet. Consider the following class:

public class Employee
{
    public int Id { get; set; }
    public string Name { get; set; }
    public decimal HourlyRate { get; set; }
    public DateTime HireDate { get; set; }
}

Now let's say we have a collection of these represented as an IEnumerable<Employee> and we want to be able to output it to an Excel file for offline querying/manipulation. As it turns out, this is dead simple to do with EPPlus. Have a look:

public void ExportToExcel(IEnumerable<Employee> employees, FileInfo targetFile)
{
    using (var excelFile = new ExcelPackage(targetFile))
    {
        var worksheet = excelFile.Workbook.Worksheets.Add("Sheet1");
        worksheet.Cells["A1"].LoadFromCollection(Collection: employees, PrintHeaders: true);
        excelFile.Save();
    }
}

That's it. Let's break down what's going on here:
1. Create an ExcelPackage to model the workbook (Excel file). Note that the 'targetFile' value here is a FileInfo object representing the location on disk where I want the file to be saved.
2. Create a worksheet within the workbook.
3. Get a reference to the top-leftmost cell (addressed as A1) and invoke the 'LoadFromCollection' method, passing it our collection of Employee objects. Behind the scenes this reflects over the properties of the type provided and pulls out any public members to become columns in the resulting Excel output. The 'PrintHeaders' parameter tells EPPlus to grab the name of the property and put it in the first row.
4. Save the Excel file.

All of the heavy lifting here is being done by the 'LoadFromCollection' method, and that's a good thing. Now, this was really easy to do, but it has some limitations. Using this approach you get a very plain, un-styled Excel worksheet.
The column widths are all set to the default. The number format for all cells is 'General' (which proves particularly interesting if you have a DateTime property in your data source). I'm a "no frills" guy, so I wasn't bothered at all by trading off style and formatting for simplicity. That said, EPPlus has tons of samples that you can download that illustrate how to apply styles and formatting to cells, and a ton of other advanced features that are way beyond the scope of this post.
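For completeness, here is what a call site might look like (the sample values and file path are made up):

var employees = new List<Employee>
{
    new Employee { Id = 1, Name = "Ada", HourlyRate = 95.00m, HireDate = new DateTime(2010, 1, 4) },
    new Employee { Id = 2, Name = "Grace", HourlyRate = 87.50m, HireDate = new DateTime(2011, 6, 20) }
};
ExportToExcel(employees, new FileInfo(@"C:\temp\employees.xlsx"));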

    Read the article

  • Useful Sharepoint Goodies

    - by Patrick Olurotimi Ige
I came across this list of very interesting stuff below (and it could save lots of time):

1. Faceted Search: http://facetedsearch.codeplex.com/
2. Podcasting Kit for SharePoint: http://pks.codeplex.com/
3. Knowledge Base: http://spkb.codeplex.com/
4. SharePoint Branding Tool: http://brandingtool.codeplex.com/
5. SharePoint User Account Control: http://spuac.codeplex.com/
6. SharePoint Enhanced Calendar: http://spenhancedcalendar.codeplex.com/
7. Enhanced Discussion Board: http://edb.codeplex.com/
8. Wildcard Search: http://spwildcardsearch.codeplex.com/
9. SharePoint Usage Logging Kit: http://sulk.codeplex.com/
10. SharePoint Zip: http://sharepointzip.codeplex.com/
11. Facebook Kit for SharePoint: http://fks.codeplex.com/
12. Short Messages: http://spmessaging.codeplex.com/
13. Color coded calendar: http://planetwilson.codeplex.com/Release/ProjectReleases.aspx?ReleaseId=11814
14. Most Popular Pages on SharePoint: http://popularpages.codeplex.com/

Thanks to My Two Bits for putting the list together.

    Read the article

  • Window management shortcuts?

    - by pwnguin
I've got a single massive monitor at home, and I've decided to mimic the Windows 7 window tiling shortcuts. I found a few guides online using wmctrl, and it's going well, save one thing: maximized windows don't respond to it.

gconftool-2 --type string --set /apps/metacity/keybinding_commands/command_1 \
  "wmctrl -r :ACTIVE: -e 0,0,0,\
  `xwininfo -root | grep Width | awk '{ print ($2/2) }'`,\
  `xwininfo -root | grep Height | awk '{ print $2 }'`"

(I've added line returns to make an otherwise massive one-liner readable.) I've bound this to a hotkey and it works, unless the window is maximized. Any ideas on how to fix this up?
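For what it's worth, one likely fix (a sketch, not tested against this exact setup): wmctrl can clear a window's maximized state before moving it, so prepending an unmaximize call to the bound command should make maximized windows respond as well:

# Clear both maximized flags first; a maximized window ignores move/resize requests
wmctrl -r :ACTIVE: -b remove,maximized_vert,maximized_horz
# ...then move/resize as before (the geometry here is a placeholder for the awk-computed values)
wmctrl -r :ACTIVE: -e 0,0,0,960,1080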

    Read the article

  • How to make text file created in Ubuntu compatible with Windows notepad

    - by Saurav Kumar
Sometimes I have to use text files created in Ubuntu with editors like gedit in Windows. So basically, whenever I create a text file in Ubuntu I append the extension .txt, and that does make the file open in Windows very easily. But the actual problem is that text files created in Ubuntu are very difficult to read when opened in Windows' Notepad: no matter how many new lines were used, all the lines appear as a single line. Is there an easy way to save text files in Ubuntu's editors so that they look exactly the same in Windows' Notepad? Like some plugin for gedit which makes it compatible with Windows' Notepad? Or something else… I searched but didn't get any help. I appreciate the help. Thanks in advance!
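The likely cause is line endings rather than anything gedit does wrong: Linux tools end lines with a bare line feed (LF), while classic Windows Notepad only breaks lines on a carriage return plus line feed pair (CRLF). A quick sketch of the usual conversion (the file name is just an example):

# unix2dos converts LF endings to CRLF; it ships in the dos2unix package
# (sudo apt-get install dos2unix)
unix2dos notes.txt

# Alternatively, with GNU sed and no extra packages:
sed -i 's/$/\r/' notes.txt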

    Read the article

  • Building InstallShield based Installers using Team Build 2010

    - by jehan
Last few weeks, I have been working on Application Packaging stuff using all the widely used tools like InstallShield, WISE, WiX and Visual Studio Installer. So, I thought it would be good to post about how to build the installers developed using these tools with Team Build 2010. This post will focus on how to build the InstallShield generated packages using Team Build 2010. For the release of VS2010, Microsoft partnered with Flexera, the makers of InstallShield, to create InstallShield Limited Edition especially for the customers of Visual Studio. Microsoft first planned to release WiX (Windows Installer XML) with VS2010, but later dropped WiX from VS2010 for reasons best known to them and partnered with InstallShield for the Limited Edition. This disappointed a lot of people, because InstallShield Limited Edition provides only a few features of InstallShield, it may not be feasible to build complex installer packages with it, and it also requires a license, whereas WiX is open source with no license costs and has proved efficient in building the most complex packages. Only the last three features in the list below, from the total set offered by InstallShield, are available in InstallShield Limited Edition:

- Standalone Build System: Maintain a clean build machine by using only the part of InstallShield that compiles the installations.
- InstallShield Best Practices Validation Suite: Avoid common installation issues.
- Try and Die Functionality: Create a fully functional trial version of your product.
- InstallShield Repackager: Create Windows Installer setups from any legacy installation.
- Multilingual Support: Present installation text in up to 35 languages.
- Microsoft App-V™ Support: Deploy your applications as App-V virtual packages that run without conflict.
- Industry-Standard InstallScript: Achieve maximum flexibility in your installations.
- Dialog Editor: Modify the layout of existing end-user dialogs, create new custom dialogs, and more.
- Patch Creation: Build updates and patches for your products.
- Setup Prerequisite Editor: Easily control prerequisite restart behavior and source locations.
- String Editor View: Control the localizable text strings displayed at run time with this spreadsheet-like table.
- Text File Changes View: Configure search-and-replace actions for content in text files to be modified at run time.
- Virtual Machine Detection: Block your installations from running on virtual machines.
- Unicode Support: Improve multi-language installation development.
- Support for 64-Bit COM Extraction: Extract COM data from a 64-bit COM server.
- Windows Installer Installation Chaining: Add MSI packages to your main installation and chain them together.
- XML Support: Save time by quickly testing XML configuration changes to installation projects.
- Billboard Support for Custom Branding: Display Adobe Flash billboards and other graphic files during the install process.
- SaaS Support (IIS 7 and SSL Technologies): Easily deploy Windows-based Web applications.
- Project Assistant: Jumpstart a project by using a simplified set of views.
- Support for Digital Signatures: Save time by digitally signing all your files at build time.
- Easily Run Custom Actions: Schedule a custom action to run at precisely the right moment in your installation.
- Installation Prerequisites: Check for and install prerequisites before your installation is executed.
To create an InstallShield project in Visual Studio and build it using Team Build 2010, first you have to add the InstallShield project template to your solution. If you want to use InstallShield Limited Edition, you can add it from File > New > Project > Other Project Types > Setup and Deployment > InstallShield LE; if you are using other versions of InstallShield, you have to add it from File > New > Project > InstallShield Projects. Here, I'm using InstallShield 2011 Premier edition as I already have it installed. I have created a simple package for the TailSpin application, which has a feature called Web, a few components and an IIS web site for the TailSpin application. Before starting work on this, I thought I might need to build the package by calling the Invoke Process activity in the build process template, or have to create a new custom activity. But it got built without any changes to the build process template. However, it was failing with the below error message: C:\Program Files (x86)\MSBuild\InstallShield\2011\InstallShield.targets (68): The "InstallShield.Tasks.InstallShield" task could not be loaded from the assembly C:\Program Files (x86)\MSBuild\InstallShield\2010Limited\InstallShield.Tasks.dll. Could not load file or assembly 'file:///C:\Program Files(x86)\MSBuild\InstallShield\2011\InstallShield.Tasks.dll' or one of its dependencies. An attempt was made to load a program with an incorrect format. Confirm that the <UsingTask> declaration is correct, that the assembly and all its dependencies are available, and that the task contains a public class that implements Microsoft.Build.Framework.ITask. This error is due to the 64-bit build machine I'm using, and it is reproducible whenever you queue a build on a 64-bit build machine. To avoid it, you have to ensure that the build definition for your InstallShield project loads the InstallShield.Tasks.dll file (which is a 32-bit file); otherwise, you will encounter this build error informing you that the InstallShield.Tasks.dll file could not be loaded. To select the 32-bit version of MSBuild, click the Process tab of your build definition in Team Explorer. Then, under the Advanced node, find the MSBuild Platform setting and select x86. Note that if you are using a 32-bit build machine, you can select either Auto or x86 for the MSBuild Platform setting. Once I made the above changes, the build succeeded.

    Read the article

  • System reverts to 87Hz refresh rate at every startup after I have installed nvidia drivers

    - by Mohammad Kamil Nadeem
Every time the system starts, the screen's refresh rate reverts to 87Hz, which results in a pixelated and flickery screen that I have to manually correct every time by selecting 60Hz as my refresh rate. I have tried "Save to X Configuration File" and even tried making the changes as root, but to no avail, as it again reverts to 87Hz on every system startup. The open source drivers are okay for regular Unity, but many games don't work with them, hence I had to install the nvidia drivers. I have been facing this since the beta phase, although this is on a fresh installation of the 12.04 final release. I am also providing my xorg.conf file just in case it might help: http://paste.ubuntu.com/952196/ Also, for some reason Displays shows my CRT monitor as a laptop, though on the open source drivers it was listed as a 14" CRT. This bug is also present on Edubuntu 12.04. It is not present on Xubuntu 12.04, where I had selected to install updates and 3rd party software during the install and was greeted with the correct refresh rate on boot. I like Xubuntu.

    Read the article

  • Lego Sport Champions: Soccer – An 80s Style LEGO Video

    - by Asian Angel
Are you a fan of LEGO and soccer? Then watch as these two teams use some fancy LEGO footwork to try and win the championship game in this nicely done retro-look video. Wait!! Is that player building a brick wall?? Lego Sport Champions: Soccer [YouTube]

    Read the article

  • How to Import Data Taxonomy Into Managed Metada Service in SharePoint 2010

    - by Wayne
First, open the Term Store Management Tool (Site Actions > Site Settings > Term Store Management) and download the sample import file. (Remember, Service Applications are configured on a per Web Application basis, so use any site collection inside a WebApp configured with your MMS.) Second, insert your data. In the photo below, I demonstrate creating a term called USA. Under that, I create the term Alabama. Under that, 4 cities. Then under USA, a term called Alaska. The point is that we have a hierarchy; using the import file, we can go 7 layers deep. The last step is to save the file, head back into the Term Store Management Tool, select/create a group, and import the file.
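As an illustration, here is a sketch of what the import file might contain for the USA/Alabama example above. The column layout follows the sample CSV the Term Store Management Tool offers for download; the term set name "Locations" and the description are made up for this example:

"Term Set Name","Term Set Description","LCID","Available for Tagging","Term Description","Level 1 Term","Level 2 Term","Level 3 Term","Level 4 Term","Level 5 Term","Level 6 Term","Level 7 Term"
"Locations","Sample geography term set",1033,TRUE,,"USA",,,,,,
"Locations",,1033,TRUE,,"USA","Alabama",,,,,
"Locations",,1033,TRUE,,"USA","Alabama","Birmingham",,,,
"Locations",,1033,TRUE,,"USA","Alabama","Montgomery",,,,
"Locations",,1033,TRUE,,"USA","Alaska",,,,,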

    Read the article

  • Sitefinity SimpleImageSelector to return Url of image instead of Guid

    - by Joey Brenn
It's been quite a while, but I've found something to blog about! I've been working with Sitefinity for some time now, and one of the things that I've struggled with, and I'm not the only one, is something that should be simple. See, all I want to do is be able to choose a picture from one of the libraries within Sitefinity and be able to display it via the GUID it returns or the path of the URL. I want to do this from my user control or a custom control. Well, it turns out that this is not built in; at least I've not been able to get anything working correctly until I found this post and was able to get it to work. However, I want to store the relative URL of the image, so I made a small change to make it return the URL instead of the GUID. To make the change, in the SimpleImageSelectorDialog.js file, on line 43, change the original line:

var selectedValue = this.get_imageSelector().get_selectedImageId();

to the new line:

var selectedValue = this.get_imageSelector().get_selectedImageUrl();

Of course, save and recompile the project, and now it will return the URL instead of the GUID of the image from the chosen album.

    Read the article

  • Application for taking pretty screenshots (like OS X does)

    - by Oli
I've been building a website for a guy who uses Mac OS X, and occasionally he sends me screenshots of bugs. They come out looking like this: This is fairly typical of Mac screenshots. You get the window decorations, the shadow from the window and a white or transparent background (not the desktop wallpaper – I've checked). Compare this to an Ubuntu window shot (Alt+Print Screen): It's impossible to keep a straight face and say the Ubuntu one is anywhere near as elegant. My question is: is there an application that can do this in Ubuntu? Edit: Follow up: is there an application that can do this in one move? Shutter is pretty good, but running the plugin for every screenshot is pretty tiresome, as it doesn't seem to remember my preference (I want south-shadow, and that requires selecting south, then clicking refresh, then save), and it's more clicks than I'd like. Is there a simple way of telling Shutter I want south-shadow for all screenshots (except entire-desktop and area-selection ones)?

    Read the article

  • Using the Static Code Analysis feature of Visual Studio (Premium/Ultimate) to find memory leakage problems

    - by terje
Memory for managed code is handled by the garbage collector, but if you use any kind of unmanaged code, like native resources of any kind, open files, streams and window handles, your application may leak memory if these are not properly handled. To handle such resources, the classes that own them in your application should implement the IDisposable interface, and preferably implement it according to the pattern described for that interface. When you suspect a memory leak, the immediate impulse is to start up a memory profiler and start digging into that. However, before you follow that impulse, do a Static Code Analysis run with a ruleset tuned to finding possible memory leaks in your code. If you get any warnings from this, fix them before you go on with the profiling.

How to use a ruleset
In Visual Studio 2010 (Premium and Ultimate editions) you can define your own rulesets containing a list of Static Code Analysis checks. I have defined the memory checks as shown in the lists below as ruleset files, which can be downloaded – see the bottom of this post. When you get them, you can easily attach them to every project in your solution using the Solution Properties dialog. Right click the solution and choose Properties at the bottom, or use the Analyze menu and choose "Configure Code Analysis for Solution". In this dialog you can now choose the Memorycheck ruleset for every project you want to investigate. Pressing Apply or OK opens every project file and changes the project's code analysis ruleset to the one we have specified here.

How to define your own ruleset (skip this if you just download my predefined rulesets)
If you want to define the ruleset yourself, open the properties on any project, choose the Code Analysis tab near the bottom, choose any ruleset in the drop box and press Open. Clear out all the rules by selecting "Source Rule Sets" in the Group By box and unselecting the box. Change the Group By box to ID, and select the checks you want to include from the lists below. Note that you can change the action for each check to either warning, error or none, none being the same as unchecking the check. Now go to the properties window and set a new name and description for your ruleset. Then save (File/Save As) the ruleset using the new name as its name, and use it for your projects as detailed above. It can also be wise to add the ruleset to your solution as a solution item; that way it's there if you want to enable Code Analysis in some of your TFS builds.

Running the code analysis
In Visual Studio 2010 you can either do your code analysis project by project using the context menu in the solution explorer and choosing "Run Code Analysis", or you can define a new solution configuration, call it for example Debug (Code Analysis), and for each project in it enable the "Enable Code Analysis on Build" setting. In Visual Studio Dev-11 it is all much simpler: just go to the solution root in the Solution Explorer, right click and choose "Run code analysis on solution".

The ruleset checks
The following list shows the essential and critical memory checks.
- CA1001 – Types that own disposable fields should be disposable. Can be ignored: No. Fix suggestions: http://msdn.microsoft.com/en-us/library/ms182172.aspx
- CA1049 – Types that own native resources should be disposable. Can be ignored: only if the pointers assumed to point to unmanaged resources point to something else. http://msdn.microsoft.com/en-us/library/ms182173.aspx
- CA1063 – Implement IDisposable correctly. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms244737.aspx
- CA2000 – Dispose objects before losing scope. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182289.aspx
- CA2115 (note 1) – Call GC.KeepAlive when using native resources. Can be ignored: see description. http://msdn.microsoft.com/en-us/library/ms182300.aspx
- CA2213 – Disposable fields should be disposed. Can be ignored: if you are not responsible for the release, or if Dispose occurs at a deeper level. http://msdn.microsoft.com/en-us/library/ms182328.aspx
- CA2215 – Dispose methods should call base class dispose. Can be ignored: only if the call to base happens at a deeper calling level. http://msdn.microsoft.com/en-us/library/ms182330.aspx
- CA2216 – Disposable types should declare a finalizer. Can be ignored: only if the type does not implement IDisposable for the purpose of releasing unmanaged resources. http://msdn.microsoft.com/en-us/library/ms182329.aspx
- CA2220 – Finalizers should call base class finalizers. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182341.aspx

Note 1: Does not result in a memory leak, but may cause the application to crash.

The list below is a set of optional checks that may be enabled for your ruleset, because the issues these point to often happen as a result of attempting to fix the warnings from the first set.

- CA1060 – Move P/Invokes to NativeMethods class. Type of fault: Security. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182161.aspx
- CA1816 – Call GC.SuppressFinalize correctly. Performance. Can be ignored: sometimes, see description. http://msdn.microsoft.com/en-us/library/ms182269.aspx
- CA1821 – Remove empty finalizers. Performance. Can be ignored: No. http://msdn.microsoft.com/en-us/library/bb264476.aspx
- CA2004 – Remove calls to GC.KeepAlive. Performance and maintainability. Can be ignored: only if not technically correct to convert to SafeHandle. http://msdn.microsoft.com/en-us/library/ms182293.aspx
- CA2006 – Use SafeHandle to encapsulate native resources. Security. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182294.aspx
- CA2202 – Do not dispose of objects multiple times. Exception (System.ObjectDisposedException). Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182334.aspx
- CA2205 – Use managed equivalents of Win32 API. Maintainability and complexity. Can be ignored: only if the replacement doesn't provide needed functionality. http://msdn.microsoft.com/en-us/library/ms182365.aspx
- CA2221 – Finalizers should be protected. Incorrect implementation, only possible in MSIL coding. Can be ignored: No. http://msdn.microsoft.com/en-us/library/ms182340.aspx

Downloadable ruleset definitions
I have defined three rulesets: one called Inmeta.Memorycheck with the rules in the first list above, one called Inmeta.Memorycheck.Optionals containing the rules in the second list, and the last one called Inmeta.Memorycheck.All containing the sum of the first two. All three rulesets can be found in the zip archive "Inmeta.Memorycheck" downloadable from here.
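As a reminder of what the checks above enforce, here is a minimal sketch of the standard dispose pattern the IDisposable documentation describes (type, field and file names are illustrative only):

using System;
using System.IO;

public class NativeResourceHolder : IDisposable
{
    private FileStream _stream = File.OpenWrite("data.bin"); // managed disposable field (CA1001, CA2213)
    private IntPtr _handle;                                  // unmanaged resource (CA1049, CA2216)
    private bool _disposed;

    public void Dispose()
    {
        Dispose(true);
        GC.SuppressFinalize(this); // disposed explicitly, so no finalization needed (CA1816)
    }

    protected virtual void Dispose(bool disposing)
    {
        if (_disposed) return;
        if (disposing && _stream != null)
        {
            _stream.Dispose(); // managed fields only on the explicit path (CA2213)
        }
        if (_handle != IntPtr.Zero)
        {
            // release the native handle here (e.g. CloseHandle via P/Invoke) on both paths
            _handle = IntPtr.Zero;
        }
        _disposed = true;
    }

    ~NativeResourceHolder() // finalizer as a safety net for the unmanaged handle (CA2216, CA2220)
    {
        Dispose(false);
    }
}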
Links to some other resources relevant to Static Code Analysis:
- MSDN Magazine article by Mickey Gousset on Static Code Analysis in VS2010
- MSDN: Analyzing Managed Code Quality by Using Code Analysis, the root of the documentation for this
- Preventing generated code from being analyzed using attributes
- Online training course on Using Code Analysis with VS2010
- Blog post by Tatham Oddie on custom code analysis rules
- How to write custom rules, from the Microsoft Code Analysis Team Blog
- Microsoft Code Analysis Team Blog

    Read the article

  • SQL SERVER – Powershell – Importing CSV File Into Database – Video

    - by pinaldave
Laerte Junior is my very dear friend and a PowerShell expert. On my request he has agreed to share PowerShell knowledge with us. Laerte Junior is a SQL Server MVP and, through his technology blog and Simple-Talk articles, an active member of the Microsoft community in Brazil. He is a skilled Principal Database Architect, Developer, and Administrator, specializing in SQL Server and PowerShell programming, with over 8 years of hands-on experience. He holds a degree in Computer Science, has been awarded a number of certifications (including MCDBA), and is an expert in SQL Server 2000 / SQL Server 2005 / SQL Server 2008 technologies. Let us read the blog post in his own words.

I was reading an excellent post from my great friend Pinal about loading data from CSV files, SQL SERVER – Importing CSV File Into Database – SQL in Sixty Seconds #018 – Video, to SQL Server, and was honored to write another guest post on SQLAuthority about the magic of PowerShell. The biggest stuff at TechEd NA this year was PowerShell. Fellows, if you still don't know about it, it is better to run. Remember that the Core Servers to SQL Server are the future, and consequently the Shell. You don't want to be out of this, right? Let's see some PowerShell magic now. To start our tour, first we need to download these two functions from PowerShell and SQL Server Master Jedi Chad Miller: Out-DataTable and Write-DataTable. Save them in a module and add it to your profile. In my case, the module is called functions.psm1. To have some data to play with, I created 10 CSV files with the same content. I just put the SQL Server errorlog into a CSV file and created 10 copies of it:

# Just create a CSV with data to import, using the SQL Server errorlog
[reflection.assembly]::LoadWithPartialName("Microsoft.SqlServer.Smo")
$ServerInstance = new-object ("Microsoft.SqlServer.Management.Smo.Server") $Env:Computername
$ServerInstance.ReadErrorLog() | Export-Csv -Path "c:\SQLAuthority\ErrorLog.csv" -NoTypeInformation
for ($Count = 1; $Count -le 10; $Count++) {
    Copy-Item "c:\SQLAuthority\Errorlog.csv" "c:\SQLAuthority\ErrorLog$($Count).csv"
}

Now in my path c:\sqlauthority, I have 10 CSV files. Now it is time to create a table. In my case, the SQL Server is called R2D2, the database is SQLServerRepository and the table is CSV_SQLAuthority:

CREATE TABLE [dbo].[CSV_SQLAuthority](
    [LogDate] [datetime] NULL,
    [Processinfo] [varchar](20) NULL,
    [Text] [varchar](MAX) NULL
)

Let's play a little bit. I want to import all CSV files from the path to the table synchronously:

# Importing synchronously
$DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
$DataTable = Out-DataTable -InputObject $DataImport
Write-DataTable -ServerInstance R2D2 -Database SQLServerRepository -TableName CSV_SQLAuthority -Data $DataTable

Very cool, right? Let's do it asynchronously and in the background using PowerShell jobs:

# If you want to do it all asynchronously
Start-Job -Name 'ImportingAsynchronously' `
    -InitializationScript { Ipmo Functions -Force -DisableNameChecking } `
    -ScriptBlock {
        $DataImport = Import-Csv -Path (Get-ChildItem "c:\SQLAuthority\*.csv")
        $DataTable = Out-DataTable -InputObject $DataImport
        Write-DataTable -ServerInstance "R2D2" `
                        -Database "SQLServerRepository" `
                        -TableName "CSV_SQLAuthority" `
                        -Data $DataTable
    }

Oh, but what if I have CSV files that are large in size and I want to import each one asynchronously?
In this case, this is what should be done:

Get-ChildItem "c:\SQLAuthority\*.csv" | % {
    Start-Job -Name "$($_)" `
        -InitializationScript { Ipmo Functions -Force -DisableNameChecking } `
        -ScriptBlock {
            $DataImport = Import-Csv -Path $args[0]
            $DataTable = Out-DataTable -InputObject $DataImport
            Write-DataTable -ServerInstance "R2D2" `
                            -Database "SQLServerRepository" `
                            -TableName "CSV_SQLAuthority" `
                            -Data $DataTable
        } -ArgumentList $_.fullname
}

How cool is that? Let's do the fun stuff now. Let's schedule it as an SQL Server Agent job. If you are using SQL Server 2012, you can use the PowerShell job step; otherwise you need to use a CmdExec job step calling PowerShell.exe. We will use the second option. First, create a ps1 file called ImportCSV.ps1 with the script above and save it in a path. In my case, it is in c:\temp\automation. Just add these lines at the end:

Get-ChildItem "c:\SQLAuthority\*.csv" | % {
    Start-Job -Name "$($_)" `
        -InitializationScript { Ipmo Functions -Force -DisableNameChecking } `
        -ScriptBlock {
            $DataImport = Import-Csv -Path $args[0]
            $DataTable = Out-DataTable -InputObject $DataImport
            Write-DataTable -ServerInstance "R2D2" `
                            -Database "SQLServerRepository" `
                            -TableName "CSV_SQLAuthority" `
                            -Data $DataTable
        } -ArgumentList $_.fullname
}
Get-Job | Wait-Job | Out-Null
Remove-Job -State Completed

Why? See my post Dooh PowerShell Trick – Running Scripts That Have Posh Jobs on a SQL Agent Job. Remember, this trick is for ALL scripts that use PowerShell jobs with any kind of scheduling tool (SQL Server Agent, Windows Scheduler). Create a job called ImportCSV and a step called Step_ImportCSV, and choose CmdExec. Then you just need to schedule or run it. I did a short video (with matching good background music) and you can see it at:

That's it, guys. C'mon, join me in the #PowerShellLifeStyle. You will love it. If you want to check what we can do with PowerShell and SQL Server, don't miss Laerte Junior's LiveMeeting on July 18. You can find more information in: LiveMeeting VC PowerShell PASS – Troubleshooting SQL Server With PowerShell – English

Reference: Pinal Dave (http://blog.sqlauthority.com)

    Read the article

  • Tool to export Microsoft project to website?

    - by Rory
Just wondering, does anyone know of a free/open source tool that takes a Microsoft Project file and exports it to HTML? I know you can save a project file as HTML, so I wanted a tool that would do this automatically, maybe also displaying graphs/Gantt charts as well? If not, any ideas of how I would write a program to do this, preferably in Java? I know of Aspose.Tasks (http://www.aspose.com/categories/.net-components/aspose.tasks-for-.net/default.aspx), which can export project files to Gantt charts in PNG format, but it's not free and is only available in C#.
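One possible starting point for the "write it yourself in Java" route is the open-source MPXJ library, which reads .mpp files; emitting HTML is then plain string building. This is only a sketch – the method names are from memory (newer MPXJ versions use getTasks() rather than getAllTasks()), so check the MPXJ docs before relying on it:

import java.io.FileWriter;
import net.sf.mpxj.ProjectFile;
import net.sf.mpxj.Task;
import net.sf.mpxj.mpp.MPPReader;

public class ProjectToHtml {
    public static void main(String[] args) throws Exception {
        // Read the Microsoft Project file (the path is an example)
        ProjectFile project = new MPPReader().read("plan.mpp");
        FileWriter out = new FileWriter("plan.html");
        try {
            out.write("<html><body><table border='1'>\n");
            out.write("<tr><th>Task</th><th>Start</th><th>Finish</th></tr>\n");
            for (Task task : project.getAllTasks()) {
                out.write("<tr><td>" + task.getName() + "</td><td>"
                        + task.getStart() + "</td><td>" + task.getFinish() + "</td></tr>\n");
            }
            out.write("</table></body></html>\n");
        } finally {
            out.close();
        }
    }
}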

    Read the article
