Search Results

Search found 3177 results on 128 pages for 'david talamelli'.


  • Rankings dropping after small URL-change WITH 301-redirect

    - by David
    Two weeks ago, we attempted to make the URLs of ca. 12 pages more search-engine friendly. We changed three things. 1. Make URLs more SEF from: /????-????/brandname.html (meaning: /aircon-price/daikin.html) to: /????-brandnameinenglish-brandnameinthai.html We set up 301-redirects from the old to the new URLs. You can find an example and the link to our page here: http://bit.ly/XRoTOK There are no direct external links to the old URLs. 2. Added text to img-links from homepage to brand-pages Before those changes, we only linked to those brands with a picture, so we added some text under the picture. You can see that here, in the left submenu: http://bit.ly/XRpfoF 3. Minor changes to Title, h1-Tags, Meta Description, etc. Only minor changes, to better match the on-site optimization with targeted keywords. For example, where before we used full brand names, afterwards we used what was really searched for: from: Mitsubishi Electric Mr. Slim to: ???? Mitsubishi (means: Aircon Mitsubishi) Three days after these changes, we noticed a heavy drop (an 80% loss in non-paid search traffic) in rankings and traffic for those pages, and also for all pages which are sub-categorized. Rankings for all keywords not affected by the changes stayed the same. Any ideas what happened, and how we can regain our old rankings? What we have already done is submit a new sitemap. Help much appreciated. Best regards, David

    Read the article

  • Compare two NTP servers

    - by David Turner
    Hi, I want to compare the time used by our internal servers against time.microsoft.com. Is there an easy way to do this? Basically, a third party sends me messages stamped with a time that has been synced with time.microsoft.com; unfortunately, our servers are using a different time server, so I want to calculate whether there is a significant difference between our NTP-synced time and theirs. Is there a simple way to accurately compare times? Regards, David.
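
    A quick way to quantify the offset is to sample both time sources from the same machine and compare the reported offsets. A hedged sketch, assuming w32tm is available (Windows) or ntpdate (Linux); internal-ntp.example.local is a hypothetical placeholder for your own server:

        REM sample the offset against the reference server five times
        w32tm /stripchart /computer:time.microsoft.com /samples:5 /dataonly

        REM sample your internal NTP server the same way, then compare
        w32tm /stripchart /computer:internal-ntp.example.local /samples:5 /dataonly

    On Linux, "ntpdate -q time.microsoft.com" performs a similar query without setting the clock.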

    Read the article

  • nginx proxy to different path

    - by David Robertson
    I've read through the documentation for nginx's HttpProxyModule, but I can't figure this out: I want it so that if someone visits, for example, http://ss.example.com/1339850978, nginx will proxy them to http://dl.dropbox.com/u/xxxxx/screenshots/1339850978.png. If I were to just use this line in my config file: proxy_pass http://dl.dropbox.com/u/xxxxx/screenshots/;, then they would have to append the .png themselves. TIA, David.
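
    One way to do this (a sketch, not tested against the asker's setup; /u/xxxxx is the placeholder from the question) is to rewrite the URI inside the location block before proxying, so nginx appends the extension itself:

        server {
            listen 80;
            server_name ss.example.com;

            location ~ ^/(\d+)$ {
                # append the missing .png, then hand off to the upstream
                rewrite ^/(\d+)$ /u/xxxxx/screenshots/$1.png break;
                proxy_pass http://dl.dropbox.com;
            }
        }

    The "break" flag stops rewrite processing, and proxy_pass without a URI part forwards the rewritten URI as-is.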

    Read the article

  • Correct Display configuration. Errors while trying to arrange displays

    - by David Russell Parrish Bojrquez
    I am trying to set up my TV with my laptop through a VGA cable. The display application in Ubuntu throws a lot of errors at me and I have given up on trying to do it myself. I try to apply the 1920x1080 display. The selected configuration for displays could not be applied Requested size (3200, 1080) exceeds 3D hardware limit (2048, 2048). You must either rearrange the displays so that they fit within a (2048, 2048) square or select the Ubuntu 2D session at login. And also this: Failed to apply configuration: %s GDBus.Error:org.gtk.GDBus.UnmappedGError.Quark._gnome_2drr_2derror_2dquark.Code3: Requested size (3200, 1080) exceeds 3D hardware limit (2048, 2048). You must either rearrange the displays so that they fit within a (2048, 2048) square or select the Ubuntu 2D session at login. Please help. @Leozitop No, I don't see anything when connected at 1920x1080 because the setup fails before actually applying. Yes, there are other resolutions which do work. I believe the problem has something to do with the rotation it is set up with. My Ubuntu Display application has only clockwise and counterclockwise options for the TV display. I really don't know why this is happening. Basic procedure: plug in cable, did not get the resolution I wanted, changed settings, applied them, repeat until the desired display is shown. I'm not computer illiterate; it really baffles me that this is happening. Output of xrandr: david@LapUbuntu:~$ xrandr Screen 0: minimum 320 x 200, current 1880 x 800, maximum 4096 x 4096 LVDS1 connected 1280x800+0+0 (normal left inverted right x axis y axis) 331mm x 207mm 1280x800 60.0*+ 1024x768 60.0 800x600 60.3 56.2 640x480 59.9 VGA1 connected 600x800+1280+0 left (normal left inverted right x axis y axis) 1600mm x 900mm 1920x1080 60.0 + 1280x1024 60.0 1360x768 60.0 1280x720 60.0 1024x768 60.0 800x600 60.3* 640x480 60.0 TV1 unknown connection (normal left inverted right x axis y axis) 848x480 59.9 + 640x480 59.9 + 1024x768 59.9 800x600 59.9 Note that VGA says left, and indeed it is, but no other option was available in the display. Also note the TV1 unknown connection, which I have no idea what it is. Note, also, that this has nothing to do with the display itself, since W7 on the computer works fine, and while booting up, and also before starting a session in Ubuntu, the rotation is normal. I'll also mention that I HAVE re-installed Ubuntu since I posted this question from a Live CD of 12.04 LTS. Also, before posting this question, likewise on 12.04 (before another backup that I had to do), the VGA setup was fine without any problems.
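
    For what it's worth, the error says the combined desktop (1280 + 1920 pixels wide) exceeds the 2048x2048 framebuffer limit of the 3D hardware. One hedged workaround, based on the xrandr output quoted above: drive only the TV at 1920x1080 (which fits on its own) and clear the stuck rotation:

        # turn the laptop panel off and run the TV alone at 1920x1080,
        # resetting the left rotation that VGA1 is stuck in
        xrandr --output LVDS1 --off --output VGA1 --rotate normal --mode 1920x1080

    This is only a sketch; re-enable the laptop panel afterwards with "xrandr --output LVDS1 --auto".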

    Read the article

  • Cooking with Wessty: HTML 5 and Visual Studio

    - by David Wesst
    The hardest part about using a new technology, such as HTML 5, is getting to know what features are available and the syntax. One way to learn how to use new technologies is to adapt your current development process to help you use the technology in the comfort of your own development environment. For .NET web developers, that environment is usually Visual Studio 2010. This technique intends to show you how to get HTML 5 Intellisense working in your current version of Visual Studio 2008 or 2010, making it easier for you to start using HTML 5 features in your current .NET web development projects. Quick Note According to the Visual Web Developer team at Microsoft, the Visual Studio 2010 SP1 beta has support for both HTML 5 and CSS 3. If you are willing to try out the bleeding-edge update from Microsoft, then you won't need this technique. --- Ingredients Visual Studio 2008 or 2010 Your favourite HTML 5 compliant browser (e.g. Internet Explorer 9) Administrator privileges, or the ability to install Visual Studio Extensions in your development environment. Directions Download the HTML 5 Intellisense for Visual Studio 2008 and 2010 extension from the Visual Studio Extension Gallery. Install it. Open Visual Studio. Open up a web file, such as an HTML or ASPX file. The HTML Source Editing toolbar should have appeared. (Optional) If it did not appear, you can activate it through the main menu by selecting View, then Toolbars, and then selecting HTML Source Editing if it does not have a checkbox beside it. (NOTE: If there is a checkbox, then the toolbar is enabled) In the HTML Source Editing toolbar, open up the validation schema drop box, and select HTML 5. Et voila! You now have HTML 5 intellisense enabled to help you get started in adding HTML 5 awesomeness to your web sites and web applications. Optional – Setting HTML 5 Validation Options At this point, you may want to select how Visual Studio shows validation errors. You can do that in the Options menu. To get to the Options menu… In the main menu select Tools, then Options. In the Options window, select and expand Text Editor, then HTML, followed by selecting Validation. Resources HTML 5 Intellisense for Visual Studio 2008 and 2010 extension Visual Studio Extension Gallery Visual Studio 2010 SP1 Beta This post also appears at http://david.wes.st
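
    As a quick smoke test for the new schema (hypothetical markup, not from the original post), open a new HTML file and confirm that HTML 5 elements validate and autocomplete:

        <!DOCTYPE html>
        <html>
        <head><title>HTML 5 Intellisense test</title></head>
        <body>
            <!-- these HTML 5 elements should no longer show validation squiggles -->
            <header><h1>Hello, HTML 5</h1></header>
            <video src="demo.mp4" controls></video>
            <footer>It works!</footer>
        </body>
        </html>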

    Read the article

  • MDM Poised for Growth

    - by david.butler(at)oracle.com
    David Nixon, an Oracle colleague of mine, was doing some research on MDM the other day. He came up with some well-founded insights that I thought I'd share with you. Gartner recently published a note asking "Should Organizations Using ERP 'Do' Master Data Management?"  It may seem a bit strange, but that's a question Gartner has been asked by a number of companies as organizations begin to understand the importance of data governance and data stewardship.  That's because ERP suites typically "focus on integrating their own applications within suites, but have little interest in making their suites interoperate with the applications or suites of other vendors."  Therefore, Gartner is advising customers that "have deployed or plan to support multiple packaged application suites (even from the same vendor) that have different semantic data and/or process models" to add an MDM solution. And it appears that customers are taking note.  In a more recent note entitled "Search Analytics Trends: Master Data Management", Gartner noted that MDM searches on gartner.com in November 2010 "were 300% higher than [in] May 2009, indicating the increased interest and importance that businesses are placing on MDM."  Why the increased interest?  Moving towards a single version of the truth is a familiar theme, but customers are talking more about the underlying business value that this enables.  For example, businesses are talking about the need to fix master data before they can successfully move forward on SOA initiatives.  And the growing demands for compliance continue to be a major driver.  In short, companies are talking more about specific and tangible business value, and they are looking for help creating business cases for an MDM initiative. Why This Matters Gartner's notes make three things clear.  First, MDM is poised for growth as organizations gain a greater understanding of it and of the need they have for it.  Many are still sorting it out, but the demand is growing and is sure to rise.  Second, any organization with a heterogeneous computing environment should invest in MDM.  Even solutions from the same vendor may have different data models and could benefit from MDM.  But the key to growth, or which vendors will benefit the most from it, is the third and perhaps most critical point: companies need help with the business case for MDM. Oracle can help your organization build a compelling business case for MDM. We have seen our 1100+ MDM customers gain competitive advantages in a wide variety of implementations. Give us a ring.

    Read the article

  • Cooking with Wessty: WordPress and HTML 5

    - by David Wesst
    WordPress is easily one of, if not the most, popular blogging platforms on the web. With the release of WordPress 3.x, the potential for what you can do with this open source software is limitless. This technique intends to show you how to get your WordPress wielding the power of the future web, that being HTML 5. --- Ingredients WordPress 3.x Your favourite HTML 5 compliant browser (e.g. Internet Explorer 9) Directions Set up WordPress on your server or host. Note: You can set up a WordPress.com account, but you will require a paid add-on to really take advantage of this technique. Log in to the administration section of your blog using your web browser.  On the left side of the page, click the Appearance heading. Then, click on Themes. At the top of the page, select the Install Themes tab. In the search box, type "toolbox" and click Search. In the search results, you should see a theme called Toolbox. Click the Install link in the Toolbox item. A dialog window should appear with a sample picture of what the theme looks like. Click on the Install Now button in the bottom right corner. Et voila! Once the installation is done, you are ready to bring your blog into the future of the web. Try previewing your blog in HTML 5 by clicking the preview link.   Now, you are probably thinking "Man…HTML 5 looks like junk". To that, I respond: "HTML was never why your site looked good in the first place. It was the CSS." Now you have an un-stylized theme that uses HTML 5 elements throughout your WordPress site. If you want to learn how to apply CSS to your WordPress blog, you should check out the WordPress codex, which pretty much covers everything there is to cover about WordPress development. Now, remember how we noted earlier that your free WordPress.com account wouldn't take advantage of this technique? That is because, as of the time of this writing, you need to pay a fee to use custom CSS. Remember now, this only gives you the foundation to create your own HTML 5 WordPress site. There are some HTML 5 themes out there that already look good, and that were built using this as the foundation with some CSS 3 added to really spice things up. Looking forward to seeing more HTML 5 WordPress sites! Enjoy developing the future of the web. Resources Toolbox Theme JustCSS Theme WordPress Installation Tutorial WordPress Theme Development Tutorial This post also appears at http://david.wes.st

    Read the article

  • How to keep groups when pulling with git

    - by mimrock
    I have a staging site that is a working directory of a git repository. How do I set up git to let a developer pull a branch or release without changing the group of the modified files? An example. Let's say I have two developers, robin and david. They are both in the git-users group, so initially they both have write permissions on site.php. -rw-rw-r-- 1 robin git-users 46068 Nov 16 12:12 site.php drwxrwxr-x 8 robin git-users 4096 Nov 16 14:11 .git After robin-server1$ git pull origin master: -rw-rw-r-- 1 robin robin 46068 Nov 16 12:35 site.php drwxrwxr-x 8 robin git-users 4096 Nov 16 14:11 .git Now david does not have write permissions on site.php, because the group changed from 'git-users' to 'robin'. From now on, david will get a permission denied error when he tries to pull to this repository.
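
    A commonly suggested remedy (a sketch; the path is a hypothetical placeholder): make the working tree's directories setgid so files git creates inherit the git-users group, and tell git to keep files group-writable:

        # make new files inherit the directory's group instead of the user's
        cd /var/www/staging-site
        chgrp -R git-users .
        find . -type d -exec chmod g+s {} +

        # let git create group-writable files in the repository
        git config core.sharedRepository group

    Developers also need a umask that preserves the group write bit (e.g. umask 002) for this to hold across pulls.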

    Read the article

  • Learn Domain-Driven Design

    - by Ben Griswold
    I just wrote about how I like to present on unfamiliar topics. With this said, Domain-Driven Design (DDD) is no exception. This is yet another area I knew enough about to be dangerous, but I certainly was no expert.  As it turns out, researching this topic wasn't easy. I could be wrong, but it is as if DDD is a secret to which few are privy. If you search the Interwebs, you will likely find little information about DDD until you start rolling over rocks to find that one great write-up, a handful of podcasts and videos, and the Reader's Digest version of the Blue Book, which apparently you must read if you really want to get the complete, unabridged skinny on DDD.  Even Wikipedia's write-up is skimpy, which I didn't know was possible…   Here's a list of valuable resources.  If you, too, are interested in DDD, this is a good starting place.  Domain-Driven Design: Tackling Complexity in the Heart of Software by Eric Evans Domain-Driven Design Quickly, by Abel Avram & Floyd Marinescu An Introduction to Domain-Driven Design by David Laribee Talking Domain-Driven Design with David Laribee Part 1, Deep Fried Bytes Talking Domain-Driven Design with David Laribee Part 2, Deep Fried Bytes Eric Evans on Domain Driven Design, .NET Rocks Domain-Driven Design Community Eric Evans on Domain Driven Design Jimmy Nilsson on Domain Driven Design Domain-Driven Design Wikipedia What I've Learned About DDD Since the Book, Eric Evans Domain Driven Design, Alt.Net Podcast Applying Domain-Driven Design and Patterns: With Examples in C# and .NET, Jimmy Nilsson Domain-Driven Design Discussion Group DDD: Putting the Model to Work by Eric Evans The Official DDD Site

    Read the article

  • Outstanding Silverlight User Group Meeting last night

    - by Dave Campbell
    We had a great Silverlight User Group Meeting in Phoenix last night! Before I go any further I want to say thanks again to David Silverlight and Kim Schmidt for coming to talk to us! And not to forget Victor Gaudioso over the wire :) David, Kim, and Victor talked to us about the Silverlight User Group Starter Kit they are working on with an extended stellar list of talented developers. Don't bypass looking at this by thinking it's only for a User Group... this is a solid community-supported full-up application using MVVM and Ria Services that you could take and modify for your own use. Take a look at the list of developers. Chances are you know some of them... send them an email of thanks for all the hard work over the last year! David and Kim discussed the architecture and code, demonstrating features as they went. Then Victor walked through the application itself on a high-intensity live webcast from his home in California. The audience of about 15 seemed focused and interested, which says a lot about the subject and presentation. Tim Heuer came bearing some gifts (swag)... a hard copy of Josh Smith's Advanced MVVM, and a couple of cheaply upgradeable copies of VS2008 Pro that were snatched up very quickly. We also gave away a few copies of Windows 7 Ultimate 64-bit, some Arc mice, and some Office 2007 disks... so I don't think anyone left empty-handed. Personal thanks from me go out to Mike Palermo and Tim Heuer for the surprise they had waiting for me that's been over Twitter, and to Victor for only mentioning it at least 3 times in a 5-minute webcast. Thanks for a great evening, and I look forward to seeing all of you in a couple weeks at MIX10!

    Read the article

  • Oracle Customer Experience Summit @ OpenWorld

    - by Michael Seback
    Businesses worldwide are operating in a new era. Customers are taking charge of their relationships with brands, and the customer experience has become the most important differentiator and driver of business value. Where is the experience heading? And how can businesses take advantage of the customer experience revolution?  Find out from experts at a one-of-a-kind event:  Oracle Customer Experience Summit @ OpenWorld Preview the Conference Schedule for October 3 – 5, 2012 Registration - Wednesday October 3, 7:00 a.m.–6:30 p.m. Westin St. Francis, Moscone West, South, Hilton San Francisco, and Hotel Nikko Sample Sessions: The Experience Imperative - Wednesday October 3, 12:30 p.m.–2:30 p.m. Mark Hurd, President, Oracle Anthony Lye, Senior Vice President, Oracle Cloud Applications Strategy David Vap, Global Vice President, Product Development, Oracle Mike Svatek, Chief Strategy Officer, Bazaarvoice Leading the Experience Revolution - Wednesday October 3, 3:45 p.m.–4:45 p.m. Seth Godin, Best-Selling Author, Founder of Squidoo.com David Vap, Global Vice President, Product Development, Oracle Driving a Customer Experience Strategy - Wednesday October 3, 5:00 p.m.–6:00 p.m. David Vap, Global Vice President, Product Development, Oracle Matthew Banks, Senior Director, Customer Experience Solutions, Oracle Register now.

    Read the article

  • IE9 and the Mystery of the Broken Video Tag

    - by David Wesst
    I was very excited when Microsoft released the Internet Explorer 9 Release Candidate. As far as I was concerned, this was another nail in the coffin for IE6 and step in the right direction for us .NET web developers as our base camp was finally starting to support the latest and greatest future-web standards. Unfortunately, my celebration was short lived as I soon hit a snag while loading up an HTML5 site I was building in Visual Studio 2010. The Mystery After updating Internet Explorer, I ran my HTML5 site that had the oh-so-lovely HTML5 video tag showing a video. Even though this worked in IE9 Beta, it appeared that IE9 RC could not load the same file. I figured that it was the video codec. Maybe IE9 RC no longer supported the video codec I used to encode my video. Here's the code I used: <video width="854" height="480" id="myOtherVideo" autoplay="" controls=""> <source src="/DemoSite1/Media/big_buck_bunny.mp4"/> <div> <p>Your browser does not support HTML5 Video.</p> </div> </video> As you can see from the code, I had the "fail-safe" code inside the video tag. The idea there being that if the video tag, or the video files themselves, are not supported by the browser my video should fail gracefully. What was even more strange was the fact that it worked in all the other HTML5 browsers that supported video. The Investigation Whoa! DJ stop the music. How can any of that make sense? Would the IE team really take such huge strides forward only to forget to include a feature that was already in the beta? I don't think so. I did plenty of searching on the web and asking around on the web, but could not seem to find anyone else having the same problem. Eventually I came across this post talking about declaring the MIME type in the .htaccess file. That got me thinking: does my web server support the video MIME type? I was using VS2010, so how do I know what kind of MIME types are supported by default? Still, my page hosted in Cassini (the web development server in VS2010) works on the other browsers. Why wouldn't it work with IE9 RC? To answer that, it was time to open up the upgraded toolbox known as the Developer's Tools in IE9 and use the new Network Tab. The Conclusion If you take a closer look at the results displayed from the Network tab, you can see that IE9 RC has interpreted the video file as text/html rather than video/mp4. To make this work, I decided to use IIS to debug my HTML5 web application by setting the web project's properties. Then, I added the MIME types that I want to support (i.e. video/mp4, video/ogg, video/webm). Et voila! The Mystery of the Broken Video Tag is solved. After Thoughts After solving the mystery, I still had the question about why my site worked in Chrome, Safari, and Firefox 3.6. After asking around, the best answer that I received was from my colleague Tyler Doerksen. He said that IE9 likely depends on the server telling it what kind of file it is downloading rather than trying to read the metadata about the data it is trying to download before doing anything. I have no facts to back this up, but it makes sense to me. In a browser war where milliseconds can make your browser fall back a few places in the race for supremacy, maybe the IE team opted to depend on the server knowing what kind of content it is serving up. Makes sense to me. In any case, that is just an educated guess. If you have any comments, feel free to post on them below. This post also appears at http://david.wes.st
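
    The equivalent of that IIS project-properties fix, expressed as a web.config fragment (a sketch, using the three MIME types the post names):

        <configuration>
          <system.webServer>
            <staticContent>
              <!-- serve HTML5 video with the MIME types IE9 expects -->
              <mimeMap fileExtension=".mp4"  mimeType="video/mp4" />
              <mimeMap fileExtension=".ogv"  mimeType="video/ogg" />
              <mimeMap fileExtension=".webm" mimeType="video/webm" />
            </staticContent>
          </system.webServer>
        </configuration>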

    Read the article

  • 5 Ways to Celebrate the Release of Internet Explorer 9

    - by David Wesst
    The day has finally come: Microsoft has released a web browser that is awesome. On Monday night, Microsoft officially introduced the world to the latest addition to its product family: Internet Explorer 9. That makes March 14, 2011 (also known as Pi Day) the official birthday of Microsoft's rebirth in the world of web browsing. Just like any big event, you take some time to celebrate. Here are a few things that you can do to celebrate the return of Internet Explorer. 1. Download It If you're not a big partier, that's fine. The one thing you can do (and definitely should) is download it and give it a shot. Sure, IE may have disappointed you in the past, but believe me when I say they really put the effort in this time. The absolute least you can do is give it a shot to see how it stands up against your favourite browser. 2. Get yourself an HTML5 Shirt One of the coolest, if not best, parts of IE9 being released is that it officially introduces HTML5 as a fully supported platform from Microsoft. IE9 supports a lot of what is already defined in the HTML5 technical spec, which really demonstrates Microsoft's support of the new standard. Since HTML5 is cool on the web, it means that it is cool to wear it too. Head over to html5shirt.com and get yourself, or your staff, or your whole family, an HTML5 shirt to show the real world that you are ready for the future of the web. 3. HTML5-ify Something Okay, so maybe a shirt isn't enough for you. Maybe you need to start using HTML5 for real. If you have a blog, or a website, or anything out there on the web, celebrate IE9 by adding some HTML5 to your site. Whether that is updating old code, adding something new, or just changing your WordPress theme, definitely take a look at what HTML5 can do for you. 4. Help Kill Old IE and Upgrade your Organization Upgrading web browsers in a large enterprise or organization is not a trivial task. A lot of companies will use the excuse of not having the resources to upgrade legacy web applications that were built for a specific version of IE and don't render correctly in newer browsers. Well, it's time to stop the excuses. IE9 allows you to define what version of Internet Explorer you would like it to emulate, as shown below. It takes minimal effort for the developer, and will get rid of the excuses. Show your IT manager or software development team this link and show them how easy it is to make old code render right in the latest and greatest from the IE team. 5. Submit an Entry for DevUnplugged So, you've made it to number five, eh? Well then, you must be pretty hardcore to make it this far down the list. Fine, let's take it to the next level and build an HTML5 game. That's right. A game. Like a video game. HTML5 introduces some amazing new features that let you build working video games using HTML5, CSS3, and JavaScript. Plus, Microsoft is celebrating the launch of IE9 with a contest where you can submit an HTML5 game (or audio application) and have a chance to win a whack of cash and other prizes. Head here for the full scoop and rules for the DevUnplugged. This post also appears at http://david.wes.st
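
    The emulation feature mentioned in point 4 is the X-UA-Compatible document mode switch; a minimal example (the exact mode value depends on which IE version the legacy app was built for):

        <!-- ask IE9 to render this page with the IE8 engine -->
        <meta http-equiv="X-UA-Compatible" content="IE=EmulateIE8" />

    The same value can also be sent as an HTTP response header for a whole site, which avoids touching the legacy markup at all.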

    Read the article

  • Silverlight Cream for March 26, 2010 -- #821

    - by Dave Campbell
    In this Issue: Max Paulousky, Christian Schormann, John Papa, Phani Raj, David Anson(-2-, -3-), Brad Abrams(-2-), and Jeff Wilcox(-2-, -3-). Shoutouts: Jeff Wilcox posted his material from mix and some preview TestFramework bits: Unit Testing Silverlight & Windows Phone Applications – talk now online At MIX10, Jeff Wilcox demo'd an app called "Peppermint"... here's the bleeding edge demo: “Peppermint” MIX demo sources Erik Mork and Co. have put out their weekly This Week In Silverlight 3.25.2010 Brad Abrams has all his materials posted for his MIX10 session Mix2010: Search Engine Optimization (SEO) for Microsoft Silverlight... including play-by-play of the demo and all source. Do you use Rooler? Well you should! Watch a video Pete Brown did with Pete Blois on Expression Blend, Windows Phone, Rooler Interested in Silverlight and XNA for WP7? Me too! Michael Klucher has a post outlining the two: Silverlight and XNA Framework Game Development and Compatibility From SilverlightCream.com: Modularity in Silverlight Applications - An Issue With ModuleInitializeException Max Paulousky has a truly ugly error trace listed by way of not having a reference listed, and the obvious simple solution. Next time he'll talk about the difficult situations. Using SketchFlow to Prototype for Windows Phone Christian Schormann has a tutorial up on using Expression Blend to develop for WP7 ... who better than Christian for that task?? Silverlight TV 18: WCF RIA Services Validation John Papa held forth with Nikhil Kothari on WCF RIA Services and validation just prior to MIX10, and was posted yesterday. Building SL3 applications using OData client Library with Vs 2010 RC Phani Raj walks through building an OData consumer in SL3, the first problem you're going to hit, and the easy solution to it. Tip: When creating a DependencyProperty, follow the handy convention of "wrapper+register+static+virtual" David Anson has a couple more of his 'Tips' up... this first is about Dependency Properties again... having a good foundation for all your Dependency Properties is a great way to avoid problems. Tip: Do not assign DependencyProperty values in a constructor; it prevents users from overriding them In the next post, David Anson talks about not assigning Dependency Property values in a constructor and gives one of the two ways to get around doing so. Tip: Set DependencyProperty default values in a class's default style if it's more convenient In his latest post, David Anson gives the second way to avoid setting a Dependency Property value in the constructor. Silverlight 4 + RIA Services - Ready for Business: Search Engine Optimization (SEO) Brad Abrams Abrams adds SEO to the tutorial series he's doing. He begins with his PDC09 session material on the subject and then takes off on a great detailed tutorial all with source. Silverlight 4 + RIA Services - Ready for Business: Localizing Business Application Brad Abrams then discusses localization and Silverlight in another detailed tutorial with all code included. Silverlight Toolkit and the Windows Phone: WrapPanel, and a few others Jeff Wilcox has a few WP7 posts I'm going to push today. 
This first is from earlier this week and is about using the Toolkit in WP7 and better than that, he includes the bits you need if all you want is the WrapPanel Data binding user settings in Windows Phone applications In the next one from yesterday, Jeff Wilcox demonstrates saving some user info in Isolated Storage to improve the user experience, and shares all the necessary plumbing files, and other external links as well. Displaying 2D QR barcodes in Windows Phone applications In a post from today, Jeff Wilcox ported his Silverlight 2D QR Barcode app from last year into WP7 ... just very cool... get the source and display your Microsoft Tag. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone    MIX10

    Read the article

  • Silverlight Cream for May 13, 2010 -- #861

    - by Dave Campbell
    In this Issue: Sigurd Snørteland, Jeff Prosise, DaveDev, Joe Zhou, Chris Eargle, John Papa(-2-, -3-), and David Anson(-2-). Shoutouts: In with the links I've listed below, Sigurd Snørteland also sent a link to this app he's working on which is actually pretty cool to see: ZuneLight. The code is not yet available. He also has a no-code demo of a Silverlight Media Center Pieter Voloshyn, Luiz Thadeu, and Jhun Iti have a very nice Silverlight image editor up: Thumba From SilverlightCream.com: WP7 - Silverlight on mobile Sigurd Snørteland submitted some links for me that have been translated to English from his blog. I hope the pages come out good because he's got a lot of good stuff on there. This one has a link to a presentation he did, and 4 projects you can load up in the emulator that he's converted to the phone: weather, worldclock, coverflow, and solitaire ... pretty cool... thanks for the links Sigurd! Understanding Page Orientation in Silverlight for Windows Phone Jeff Prosise has a really nice post up on page orientation in WP7 ... what it means to your app, how to detect it, and example code for what to do then... also love a quote by Jeff: "Silverlight for Windows Phone is the hottest thing since color TV" Why you should check out Expression Blend Behaviors when using Silverlight DaveDev has a post up describing Behaviors and why we should use them, plus tons of external links to resources, blogs, videos... all good stuff... Fiddler inspector for WCF Silverlight Polling Duplex and WCF RIA Joe Zhou announces and provides a link to a new Fiddler inspector that understands the framing in Polling Duplex and also raw binary xml and binary SOAP. Windows Phone Controls v0.7 Chris Eargle reports the release of Version 0.7 of the Windows Phone Controls project on CodePlex ... this includes a Pivot Control and a Panorama Control... both very nicely done. Binding to Silverlight ComboBox and Using SelectedValue, SelectedValuePath and DisplayMemberPath John Papa responds to a user question and put up a nice post about binding to a ComboBox and then go from the selected item to some other property ... code included No More Boxes! Exploring the PathListBox (Silverlight TV #25) Silverlight TV 25 went up on Tuesday ... thought it was going to be Thursday?? anyway ... John Papa and Adam Kinney are discussing the PathListBox and looking at some cool demos thereof. Exposing SOAP, OData, and JSON Endpoints for RIA Services (Silverlight TV 26) Since today IS Thursday, we have a new Silverlight TV, number 26, and John Papa is chatting with Deepesh Mohnani of the WCF RIA Services team about exposing all sorts of endpoints... should be something in there for everybody :) Workaround for a Silverlight data binding bug affecting various scenarios - including DataGrid+ContextMenu David Anson details the rabbit-trail he and others on the team followed in response to a problem reported via Twitter where the binding on a DataGrid seemed off by a row(!) ... weird but true, validated, and SL3/4 are bug-for-bug compatible with this too! ... But David wouldn't leave us there.. he also has a workaround. 
Sharing the code for a simple Silverlight 4 REST-based cloud-oriented file management app for Azure and S3 David Anson had an opportunity to build an app he's wanted to build for a while and shares it with us: Blobstore -- a small, lightweight Silverlight 4 application that acts as a basic front-end for the Windows Azure Simple Data Storage and the Amazon Simple Storage Service (S3) -- and remember I said he shared the source :) Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • How to restore your production database without needing additional storage

    - by David Atkinson
    Production databases can get very large. This in itself is to be expected, but when a copy of the database is needed the database must be restored, requiring additional and costly storage.  For example, if you want to give each developer a full copy of your production server, you'll need n times the storage cost for your n-developer team. The same is true for any test databases that are created during the course of your project lifecycle. If you've read my previous blog posts, you'll be aware that I've been focusing on the database continuous integration theme. In my CI setup I create a "production"-equivalent database directly from its source control representation, and use this to test my upgrade scripts. Despite this being a perfectly valid and practical thing to do as part of a CI setup, it's not the exact equivalent to running the upgrade script on a copy of the actual production database. So why shouldn't I instead simply restore the most recent production backup as part of my CI process? There are two reasons why this would be impractical. 1. My CI environment isn't an exact copy of my production environment. Indeed, this would be the case in a perfect world, and it is strongly recommended as a good practice if you follow Jez Humble and David Farley's "Continuous Delivery" teachings, but in practical terms this might not always be possible, especially where storage is concerned. It may just not be possible to restore a huge production database on the environment you've been allotted. 2. It's not just about the storage requirements, it's also the time it takes to do the restore. The whole point of continuous integration is that you are alerted as early as possible whether the build (yes, the database upgrade script counts!) is broken. If I have to run an hour-long restore each time I commit a change to source control I'm just not going to get the feedback quickly enough to react. So what's the solution? Red Gate has a technology, SQL Virtual Restore, that is able to restore a database without using up additional storage. Although this sounds too good to be true, the explanation is quite simple (although I'm sure the technical implementation details under the hood are quite complex!) Instead of restoring the backup in the conventional sense, SQL Virtual Restore will effectively mount the backup using its HyperBac technology. It creates a data and log file, .vmdf, and .vldf, that becomes the delta between the .bak file and the virtual database. This means that both read and write operations are permitted on a virtual database as from SQL Server's point of view it is no different from a conventional database. Instead of doubling the storage requirements upon a restore, there is no 'duplicate' storage requirements, other than the trivially small virtual log and data files (see illustration below). The benefit is magnified the more databases you mount to the same backup file. This technique could be used to provide a large development team a full development instance of a large production database. It is also incredibly easy to set up. Once SQL Virtual Restore is installed, you simply run a conventional RESTORE command to create the virtual database. This is what I have running as part of a nightly "release test" process triggered by my CI tool. 
    RESTORE DATABASE WidgetProduction_virtual FROM DISK=N'C:\WidgetWF\ProdBackup\WidgetProduction.bak' WITH MOVE N'WidgetProduction' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_WidgetProduction_Virtual.vmdf', MOVE N'WidgetProduction_log' TO N'C:\WidgetWF\ProdBackup\WidgetProduction_log_WidgetProduction_Virtual.vldf', NORECOVERY, STATS=1, REPLACE GO RESTORE DATABASE WidgetProduction_virtual WITH RECOVERY   Note the only change from what you would do normally is the naming of the .vmdf and .vldf files. SQL Virtual Restore intercepts this by monitoring the extension and applies its magic, ensuring the 'virtual' restore happens rather than the conventional storage-heavy restore. My automated release test then applies the upgrade scripts to the virtual production database and runs some validation tests, giving me confidence that were I to run this on production for real, all would go smoothly. For illustration, here is my 8 GB production database: And its corresponding backup file: Here are the .vldf and .vmdf files, which represent the only additional used storage for the new database following the virtual restore.   The beauty of this product is its simplicity. Once it is installed, the interaction with the backup and virtual database is exactly the same as before, as the clever stuff is being done at a lower level. SQL Virtual Restore can be downloaded as a fully functional 14-day trial. Technorati Tags: SQL Server

    Read the article

  • Installing Catalyst 11.6 for an ATI HD 6970

    - by David Oliver
    Ubuntu Maverick 10.10 is displaying the desktop okay (though limited to 1600x1200) after my having installed my new HD 6970 card, so I'm now trying to install the proprietary driver (I understand the open source one requires a more recent kernel than that in Maverick). The proprietary driver under 'Additional Drivers' resulted in a black screen on boot, so I deactivated and am trying to follow the manual install instructions at the cchtml Ubuntu Maverick Installation Guide. When I try to create the .deb packages with: sh ati-driver-installer-11-6-x86.x86_64.run --buildpkg Ubuntu/maverick I get: david@skipper:~/catalyst11.6$ sh ati-driver-installer-11-6-x86.x86_64.run --buildpkg Ubuntu/maverick Created directory fglrx-install.oLN3ux Verifying archive integrity... All good. Uncompressing ATI Catalyst(TM) Proprietary Driver-8.861......................... ===================================================================== ATI Technologies Catalyst(TM) Proprietary Driver Installer/Packager ===================================================================== Generating package: Ubuntu/maverick Package build failed! Package build utility output: ./packages/Ubuntu/ati-packager.sh: 396: debclean: not found dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions dpkg-buildpackage: source package fglrx-installer dpkg-buildpackage: source version 2:8.861-0ubuntu1 dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html> dpkg-source --before-build fglrx.64Vzxk dpkg-buildpackage: host architecture amd64 debian/rules build Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507. dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1 Cleaning in directory . /usr/bin/fakeroot: line 176: debian/rules: Permission denied debuild: fatal error at line 1319: couldn't exec fakeroot debian/rules: dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions dpkg-buildpackage: source package fglrx-installer dpkg-buildpackage: source version 2:8.861-0ubuntu1 dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html> dpkg-source --before-build fglrx.QEmIld dpkg-buildpackage: host architecture amd64 debian/rules build Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507. dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1 Cleaning in directory . Can't exec "debian/rules": Permission denied at /usr/bin/debuild line 1314. 
debuild: fatal error at line 1313: couldn't exec debian/rules: Permission denied dpkg-buildpackage: export CFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export CPPFLAGS from dpkg-buildflags (origin: vendor): dpkg-buildpackage: export CXXFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export FFLAGS from dpkg-buildflags (origin: vendor): -g -O2 dpkg-buildpackage: export LDFLAGS from dpkg-buildflags (origin: vendor): -Wl,-Bsymbolic-functions dpkg-buildpackage: source package fglrx-installer dpkg-buildpackage: source version 2:8.861-0ubuntu1 dpkg-buildpackage: source changed by ATI Technologies Inc. <http://ati.amd.com/support/driver.html> dpkg-source --before-build fglrx.oYWICI dpkg-buildpackage: host architecture amd64 debian/rules build Can't exec "debian/rules": Permission denied at /usr/bin/dpkg-buildpackage line 507. dpkg-buildpackage: error: debian/rules build failed with unknown exit code -1 Removing temporary directory: fglrx-install.oLN3ux I've installed devscripts, which includes debclean. I've tried running the command with and without sudo. I'm not experienced with installing from downloads/source, but it seems like the file debian/rules isn't being set executable when it needs to be. If I extract only, without using the package builder command, debian/rules is 744. As to what to do next, I'm stumped. Many thanks.
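
    A hedged workaround suggested by the permission errors above (the directory name is arbitrary, and this is untested against this exact driver version): extract the .run archive instead of letting it build from its temporary directory, restore the execute bits the packaging scripts expect, and build from the extracted tree:

        # unpack the makeself archive without running it
        sh ati-driver-installer-11-6-x86.x86_64.run --extract fglrx-11.6

        # debian/rules must be executable for dpkg-buildpackage to exec it
        find fglrx-11.6 -name rules -exec chmod +x {} \;
        chmod +x fglrx-11.6/packages/Ubuntu/ati-packager.sh

    Retrying the --buildpkg step after fixing permissions this way is the general idea; the exact entry point inside the extracted tree may differ by driver version.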

    Read the article

  • How to block websites using Ruckus ZoneDirector

    - by David A. Moody
    In my school we use Ruckus ZoneDirector to control our entire network. I have separate WLANs for faculty, elementary, and secondary. The elementary and secondary networks are set to go offline during recess/lunch breaks, and after school hours. This is working fine. What I need to be able to do is block Youtube access to students while leaving it accessible to teachers (faculty WLAN). Is it possible to do this? Thanks in advance. David.

    Read the article

  • Lenovo DVD Drive Disabled After Windows 7 Install

    - by David Lacher
    Upgraded hard drive in Lenovo T61P; decided to start fresh with Windows 7 Pro. Windows installed, so DVD drive was working. All of a sudden, driver is not recognized. Device is "HL-DT-ST DVDRAM GSA-U10N ATA Device". It appears on device manager but with the yellow tag; have tried uninstalling, searching for drivers, everything I can think of. Cannot even start over with Windows 7 installation disk because disk spins but then stops and My Computer does not recognize the drive. Help please. thank you. David Lacher
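
    A commonly cited fix for exactly this symptom (an optical drive that appears in Device Manager but is yellow-banged after an OS change) is to remove stale filter drivers from the CD/DVD device class and reboot. Offered here as a suggestion only; back up the registry first:

        :: {4D36E965-...} is the CD/DVD-ROM device class GUID
        reg delete "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}" /v UpperFilters
        reg delete "HKLM\SYSTEM\CurrentControlSet\Control\Class\{4D36E965-E325-11CE-BFC1-08002BE10318}" /v LowerFilters

    Run from an elevated command prompt; one of the two values may not exist, which is fine.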

    Read the article

  • Database continuous integration step by step

    - by David Atkinson
    This post will describe how to set up basic database continuous integration using TeamCity to initiate the build process, SQL Source Control to put your database under source control, and the SQL Compare command line to keep a test database up to date. In my example I will be using Subversion as my source control repository. If you wish to follow my steps verbatim, please make sure you have TortoiseSVN, SQL Compare and SQL Source Control installed. Downloading and Installing TeamCity TeamCity (http://www.jetbrains.com/teamcity/index.html) is free for up to three agents, so it's a great no-risk tool you can use to experiment with. 1. Download the latest version from the JetBrains website. For some reason the TeamCity executable didn't download properly for me, stalling frustratingly at 99%, so I tried again with the zip file download option (see screenshot below), which worked flawlessly. 2. Run the installer using the defaults. This results in a set-up with the server component and agent installed on the same machine, which is ideal for getting started with ease. 3. Check that the build agent is pointing to the server correctly. This has caught me out a few times before. This setting is in C:\TeamCity\buildAgent\conf\buildAgent.properties and for my installation is serverUrl=http\://localhost\:80 . If you need to change this value, if for example you've had to install the Server console to a different port number, the TeamCity Build Agent Service will need to be restarted for the change to take effect. 4. Open the TeamCity admin console on http://localhost, and specify your own designated username and password at first startup. Putting your database in source control using SQL Source Control 5. Assuming you've got SQL Source Control installed, select a development database in the SQL Server Management Studio Object Explorer and select Link Database to Source Control. 6. For the Link step you can either create your own empty folder in source control, or you can select Just Evaluating, which just creates a local subversion repository for you behind the scenes. 7. Once linked, note that your database turns green in the Object Explorer. Visit the Commit tab to do an initial commit of your database objects by typing in an appropriate comment and clicking Commit. 8. There is a hidden feature in SQL Source Control that opens up TortoiseSVN (provided it is installed) pointing to the linked repository. Keep Shift depressed and right-click on the text to the right of 'Linked to'; in the example below, it's the red Evaluation Repository text. Select Open TortoiseSVN Repo Browser. This screen should give you an idea of how SQL Source Control manages the object files behind the scenes. Back in the TeamCity admin console, we'll now create a new project to monitor the above repository location and to trigger a 'build' each time the repository changes. 9. In TeamCity Administration, select Create Project and give it a name, such as "My first database CI", and click Create. 10. Click on Create Build Configuration, and name it something like "Integration build". 11. Click VCS settings and then Create And Attach new VCS root. This is where you will tell TeamCity about the repository it should monitor. 12. In my case since I'm using the Just Evaluating option in SQL Source Control, I should select Subversion. 13. In the URL field paste your repository location.
    19. Click Save, then select Build Triggering / Add New Trigger / VCS Trigger. This is where you tell TeamCity when it should initiate a build. Click Save.
    20. Now return to SQL Server Management Studio and make a schema change, for example by adding a new object, to your linked development database (a minimal example is sketched at the end of this post). A blue indicator will appear in the Object Explorer. Commit this change, typing in an appropriate check-in comment. All being well, within 60 seconds (a TeamCity default that can be changed) a build will be triggered.
    21. Click on Projects in TeamCity to get back to the overview screen. The build log will show you the console output, which is useful for troubleshooting any issues.

    That's it! You now have continuous integration on your database. In future posts I'll cover how you can generate and test the database creation script and the database upgrade script, and run database unit tests as part of your continuous integration script. If you have any trouble getting this up and running please let me know, either by commenting on this post, or by emailing me directly using the email address below.

    Technorati Tags: SQL Server
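    For the schema change in step 20, any new object will do. The snippet below is a purely hypothetical example; the table name and columns are invented for illustration.

        -- Minimal example object to trigger the CI build;
        -- the name and columns are placeholders.
        CREATE TABLE dbo.Widget
        (
            WidgetId int IDENTITY(1,1) NOT NULL PRIMARY KEY,
            Name nvarchar(50) NOT NULL
        );

    Once this is committed through the SQL Source Control Commit tab, the VCS trigger picks up the new revision and the integration build updates the RedGateCI database to match.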

    Read the article

  • Master Data Management and Cloud Computing

    - by david.butler(at)oracle.com
    Cloud Computing is all the rage these days. There are many reasons why this is so. But like its predecessor, Service Oriented Architecture, it can fall on hard times if the underlying data is left unmanaged. Master Data Management is the perfect Cloud companion. It can materially increase the chances for successful Cloud initiatives. In this blog, I'll review the nature of the Cloud and show how MDM fits in.

    Here's the National Institute of Standards and Technology definition of Cloud computing:

    • Cloud computing is a model for enabling convenient, on-demand network access to a shared pool of configurable computing resources that can be rapidly provisioned and released with minimal management effort or service provider interaction.

    Cloud architectures have three main layers: applications or Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS). SaaS generally refers to applications that are delivered to end-users over the Internet. Oracle CRM On Demand is an example of a SaaS application, and today there are hundreds of SaaS providers covering a wide variety of applications, including Salesforce.com, Workday, and NetSuite. Oracle MDM applications are located in this layer of Oracle's On Demand enterprise Cloud platform; we call it Master Data as a Service (MDaaS). PaaS generally refers to an application deployment platform delivered as a service, often built on a grid computing architecture and including database and middleware; Oracle Fusion Middleware is in this category and includes the SOA and Data Integration products used to connect SaaS applications, including MDM. Finally, IaaS generally refers to computing hardware (servers, storage and network) delivered as a service, typically including the associated software as well: operating systems, virtualization, clustering, etc.

    Cloud Computing benefits are compelling for a large number of organizations. These include significant cost savings, increased flexibility, and fast deployments. Cost advantages include paying for just what you use, which is especially critical for organizations with variable or seasonal usage: companies don't have to invest to support peak computing periods, and costs are more predictable and controllable. Increased agility includes access to the latest technology and experts without making significant up-front investments.

    While Cloud Computing is certainly very alluring with a clear value proposition, it is not without its challenges. An IDC survey of 244 IT executives/CIOs and their line-of-business (LOB) colleagues identified a number of issues:

    • Security - 74% identified security as an issue involving data privacy and resource access control.
    • Integration - 61% found that it is hard to integrate Cloud Apps with in-house applications.
    • Operational Costs - 50% are worried that On Demand will actually cost more, given the impact of poor data quality on the rest of the enterprise.
    • Compliance - 49% felt that compliance with required regulatory, legal and general industry requirements (such as PCI, HIPAA and Sarbanes-Oxley) would be a major issue. When control is lost, the ability of a provider to directly manage how and where data is deployed, used and destroyed is negatively impacted.

    There are others, but I singled out these four top issues because Master Data Management, properly incorporated into a Cloud Computing infrastructure, can significantly ameliorate all of these problems. Cloud Computing can literally rain raw data across the enterprise.

    According to fellow blogger Mike Ferguson, "the fracturing of data caused by the adoption of cloud computing raises the importance of MDM in keeping disparate data synchronized." David Linthicum, CTO of Blue Mountain Labs, blogs that "the lack of MDM will become more of an issue as cloud computing rises. We're moving from complex federated on-premise systems, to complex federated on-premise and cloud-delivered systems."

    Left unmanaged, non-standard, inconsistent, ungoverned data with questionable quality can pollute analytical systems, increase operational costs, and reduce the ROI in Cloud and On-Premise applications. As cloud computing becomes more relevant, and more data, applications, services, and processes are moved out to cloud computing platforms, the need for MDM becomes ever more important. Oracle's MDM suite is designed to deal with all four of the Cloud issues listed in the IDC survey:

    • Security - MDM manages all master data attribute privacy and resource access control issues.
    • Integration - MDM pre-integrates Cloud Apps with each other and with On-Premise applications at the data level.
    • Operational Costs - MDM significantly reduces operational costs by increasing data quality, thereby improving the efficiency of enterprise business processes.
    • Compliance - MDM, with its built-in Data Governance capabilities, ensures that the data is governed according to organizational standards, which facilitates rapid and accurate reporting for compliance purposes.

    Oracle MDM creates governed, high-quality master data, producing a unified, cleansed and standardized data view. The Oracle Customer Hub creates a single view of the customer. The Oracle Product Hub creates high-quality product data designed to support all go-to-market processes. Oracle Supplier Hub dramatically reduces the chances of 'supplier exceptions'. Oracle Site Hub masters locations. And Oracle Hyperion Data Relationship Management masters financial reference data and manages enterprise hierarchies across operational areas, from ERP to EPM and CRM to SCM. Oracle Fusion Middleware connects Cloud and On-Premise applications to MDM Hubs and brings high-quality master data to your enterprise business processes.

    An independent analyst once said, "Poor data quality is like dirt on the windshield. You may be able to drive for a long time with slowly degrading vision, but at some point, you either have to stop and clear the windshield or risk everything." Cloud Computing has the potential to significantly degrade data quality across the enterprise over time. Deploying a Master Data Management solution prior to, or in conjunction with, a move to the Cloud can ensure that the data flowing into the enterprise from the Cloud is clean and governed. This will in turn ensure that the expected returns on the investment in Cloud Computing are realized.

    Oracle MDM has proven its mettle in this area and has the customers to back that up. In fact, I will be hosting a webcast on Tuesday, April 10th at 10 am PT with one of our top Cloud customers, the Church Pension Group. They have moved all mainline applications to a hosted model and use Oracle MDM to ensure the master data is managed and cleansed before it is propagated to other cloud and internal systems. I invite you to join Martin Hossfeld, VP, IT Operations, and Danette Patterson, Enterprise Data Manager, as they review the business drivers for MDM and hosted applications, how they did it, the benefits achieved, and lessons learned. You can register for this free webcast here.
Hope to see you there.

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >