Search Results

Search found 24149 results on 966 pages for 'visual studio package'.


  • A debugging experience with "highly compatible" ASP.NET 4.5

    - by Jeff
    I have to admit that I will pretty much upgrade software for no reason other than being on the latest version. I won't do it if it's super expensive (Adobe gets money from me about once every three or four years at best), but particularly with frameworks and stuff generally available as part of my MSDN subscription, I'll be bleeding edge. CoasterBuzz was running on the MVC 4 framework pretty much as soon as they did a "go live" license for it. I didn't really jump in head-first with Windows 8 and Visual Studio 2012, in part because I just wasn't interested in doing the reinstalls for each new version. Turns out there weren't that many revisions anyway. But when the final versions were released a week and a half ago, I jumped in. I saw on one of the Microsoft sites that .Net 4.5 was a "highly compatible in-place update" to the framework. Good enough for me. I was obviously running it by default in Windows 8, and installed it on my production server. I suppose it's "highly compatible," except when it isn't. Three of my sites are running with various flavors of the MVC version of POP Forums. All of them stopped working under ASP.NET 4.5. It was not immediately obvious what the problem might be beyond an exception indicating that there were no repository classes registered with Ninject, which I use for dependency injection in the forums. This was made all the more weird by the fact that it ran fine locally in the dev Web host. My first instinct was to spin up a Windows Server VM on my local box and put the remote debugger on it. (Side note: running multiple VM's on a Retina MacBook Pro with 16 gigs of RAM is pretty much the most awesome thing ever. I can't believe this computer is for real, and not a 50-pound tower under my desk.) What might have been going on in IIS that doesn't happen in Visual Studio? In the debugging process, I realized that I might be looking in the wrong place. POP Forums creates a Ninject container using a method called from a PreApplicationStartMethod attribute, and at that time registers a module (what Ninject uses to map interfaces to implementations) that maps all of the core dependencies. It also creates an instance of an HttpModule that originally hosted the "services" (search indexing, mailer, etc.), but now just records errors. That's all well and good, but the actual repository mapping, where data is actually read or persisted, happens in Application_Start() in global.asax. The idea there is that you can swap out the SqlSingleWebServer repos for something tuned for multiple servers, Oracle or something else. Of course, if I used something like StructureMap, which does convention-based mapping for dependency injection (a class implementing ISettingsRepository called SettingsRepository is automagically mapped), I wouldn't have to worry about it. In any case, the HttpModule, being instantiated before Application_Start() gets to run, would throw because there was no repo mapped where it could get settings from the database. This makes total sense. The fix is sort of a hack, where I don't setup the innards of the HttpModule until a call to its BeginRequest is made. I say it's a hack, because its primary function, logging exceptions, won't work until the app has warmed up. Still, this brings up an interesting question about the race condition, and what changed in 4.5 when it's running in IIS. 
In ASP.NET 4, it would appear that the code called via the PreApplicationStartMethod attribute was either failing silently and running again later, or it was being reached only after Application_Start was called. Either way, it's odd behavior. The real pain point I'm experiencing now is a serious bug in MVC 4 that leaves the mobile/alternate view functionality largely broken.
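    A minimal sketch of the workaround described above: defer the module's setup until the first BeginRequest, by which time Application_Start() has registered the repository bindings. The type and member names here (PopForumsModule, ServiceSetup) are illustrative, not the actual POP Forums source.

        using System;
        using System.Web;

        public class PopForumsModule : IHttpModule
        {
            private static readonly object _initLock = new object();
            private static bool _initialized;

            public void Init(HttpApplication context)
            {
                // Don't touch the container here; when the module is created via
                // PreApplicationStartMethod, Application_Start() hasn't run yet.
                context.BeginRequest += OnBeginRequest;
            }

            private static void OnBeginRequest(object sender, EventArgs e)
            {
                if (_initialized) return;
                lock (_initLock)
                {
                    if (_initialized) return;
                    // By the first request the repository mappings exist, so it is
                    // now safe to resolve settings and wire up error logging.
                    ServiceSetup.Initialize(); // hypothetical setup call
                    _initialized = true;
                }
            }

            public void Dispose() { }
        }

    The trade-off noted above still applies: anything the module would normally log between application start and the first request is lost.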

    Read the article

  • Travelling MVP #4: DevReach 2012

    - by DigiMortal
    Our next stop after Varna was Sofia, where DevReach happens. DevReach is one of my favorite conferences in Europe because of its sensible prices and strong speaker line-up. There is also a VIP party after the conference, which is a good event for meeting people you don't see every day, having some discussion with speakers, and finding new friends. Our trip from Varna to Sofia took about 6.5 hours by bus. As I was tired from the previous evening this wasn't a problem for me, since I slept half the trip. After a smoking break in Veliko Tarnovo I watched movies on the bus TV. Later we had supper in the city center at Happy's, a place with good meat dishes and nice service. And the next day it began... :)

    DevReach 2012
    DevReach is usually held in Arena Mladost. It's near the airport and the Telerik office. The event is organized by local MVP Martin Kulov together with Telerik. Two days of sessions with strong speakers is reason enough for me to visit an event. Some of the topics covered by the sessions: Windows 8 development, web development, SharePoint, Windows Azure, Windows Phone, architecture, and Visual Studio. Practically everybody can find an interesting session in every time slot. As the Arena is not huge, it is very easy to move from one session to another if the session you selected for a time slot is not what you expected. On the second floor of the Arena there are many places to eat, from simple junk-food places like Burger King to proper restaurants. If you are hungry you will find something for your taste for sure. You can also buy beer if it is too hot outside :) The weather was very good for October, practically an Estonian summer, 25C and over.

    Sessions I visited
    Here is the list of sessions I visited at DevReach 2012:
    DevReach 2012 Opening & Welcome Message with Martin Kulov and Stephen Forte
    Principled N-Tier Solution Design with Steve Smith
    Data Patterns for the Cloud with Brian Randell
    .NET Garbage Collection Performance Tips with Sasha Goldshtein
    Building Secured, Scalable, Low-latency Web Applications with the Windows Azure Platform with Ido Flatow
    It’s a Knockout! MVVM Style Web Applications with Charles Nurse
    Web Application Architecture – Lessons Learned from Adobe Brackets with Brian Rinaldi
    Demystifying Visual Studio 2012 Performance Tools with Martin Kulov
    SPvNext – A Look At All the Exciting And New Features In SharePoint with Sahil Malik
    Portable Libraries – Why You Should Care with Lino Tadros
    I missed some sessions because of some death-march projects that are still going on and that I have to coordinate, but it was not a big loss, as I had time to walk around the venue's neighborhood and see Sofia Business Park.

    Next year again!
    I will be there again next year, and hopefully more people from Estonia will join me. I think it's a good idea to take a short vacation for DevReach and do things the way we did this time: Bucharest, Varna, Sofia. It's just a good idea to plan some more free time, so we are not in such a hurry and have no work to do on the trip. So far this has been one of the best trips I have organized, and I will go and meet all those guys in this region again! :)

    Read the article

  • How do I install DB2 ODBC?

    - by Justin
    I have been trying, with no success, to install a IBM DB2 ODBC driver so that my PHP server can connect to a database. I've tried installing the db2_connect and get all sorts of problems, I tried install I Access for Linux and the RPM did not install right nor did using alien breed any useful results. I've also tried the DB2 Runtime v8.1, no success. If I attempt to run the rpm it claims I need dependencies that I can't find in apt-get. Yum is also not very helpful as it appears I don't have any repositories installed or lists... Running the simple RPM gives me this result in terminal: # rpm -ivh iSeriesAccess-7.1.0-1.0.x86_64.rpm rpm: RPM should not be used directly install RPM packages, use Alien instead! rpm: However assuming you know what you are doing... error: Failed dependencies: /bin/ln is needed by iSeriesAccess-7.1.0-1.0.x86_64 /sbin/ldconfig is needed by iSeriesAccess-7.1.0-1.0.x86_64 /bin/rm is needed by iSeriesAccess-7.1.0-1.0.x86_64 /bin/sh is needed by iSeriesAccess-7.1.0-1.0.x86_64 libc.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libc.so.6(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libc.so.6(GLIBC_2.3)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libdl.so.2()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libdl.so.2(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libgcc_s.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libm.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libm.so.6(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libodbcinst.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libodbc.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libpthread.so.0()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libpthread.so.0(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libpthread.so.0(GLIBC_2.3.2)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 librt.so.1()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 librt.so.1(GLIBC_2.2.5)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libstdc++.so.6()(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libstdc++.so.6(CXXABI_1.3)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 libstdc++.so.6(GLIBCXX_3.4)(64bit) is needed by iSeriesAccess-7.1.0-1.0.x86_64 Using alien and running the dkpg gives me thes headaque: $ alien iSeriesAccess-7.1.0-1.0.x86_64.rpm --scripts # dpkg -i iseriesaccess_7.1.0-2_amd64.deb (Reading database ... 127664 files and directories currently installed.) Preparing to replace iseriesaccess 7.1.0-2 (using iseriesaccess_7.1.0-2_amd64.deb) ... Unpacking replacement iseriesaccess ... post uninstall processing for iSeriesAccess 1.0...upgrade /var/lib/dpkg/info/iseriesaccess.postrm: line 8: [: upgrade: integer expression expected Setting up iseriesaccess (7.1.0-2) ... post install processing for iSeriesAccess 1.0...configure iSeries Access ODBC Driver has been deleted (if it existed at all) because its usage count became zero odbcinst: Driver installed. Usage count increased to 1. Target directory is /etc odbcinst: Driver installed. Usage count increased to 3. Target directory is /etc Processing triggers for libc-bin ... ldconfig deferred processing now taking place So it seems the files installed right, well my odbc driver shows up but db2cli.ini is no where to be found. So several questions. Is there a better alternative to connect php to db2, say an ubuntu package I can just install? 
Can someone direct me to the steps that would make my Ubuntu server work well with the RPM so I can build my DB2 instance? Also remember that I'm connecting to an iSeries remotely. I'm not using DB2 Express-C, even though I did try it to get the db2 PHP functions to work. And I don't have Zend, but I think I have every other package from the Ubuntu repositories. Help, thank you!

    Read the article

  • Silverlight Cream for January 04, 2011 -- #1022

    - by Dave Campbell
    In this Issue: Dennis Doomen, Doug Holland, Kunal Chowdhury, Sacha Barber, Paul Sheriff, Mike Snow(-2-), Peter Kuhn(-2-), and Mike Ormond. Above the Fold: Silverlight: "Silverlight: Fixing the BookShelf Sample" Peter Kuhn WP7: "Searching the Windows Phone 7 Marketplace Programmatically" Doug Holland Prism/Cinch: "PRISM 4 Custom Transitioning Region" Sacha Barber Shoutouts: Sacha Barber the author of Cinch asks for some advice from users: Cinch V2 : Question For The Reader Michael Crump introduces us to SnippetManager as a way to organize your Silverlight snippets... I'm thinking any snippet: A better way to organize your Silverlight Code Snippets. Andy Beaulieu announced an update of Physics Helper 4.2 using Farseer 3.2 ... check out the breaking changes though! Dennis Doomen blogged about a new release of his Fluent Assertions: A new year with a new release of Fluent Assertions, with a blog post about it below From SilverlightCream.com: Verifying PropertyChanged events in Silverlight using Fluent Assertions Dennis Doomen release his latest Fluent Assertions for .NET and Silverlight and wrote up a big post about the new event monitoring syntax. Searching the Windows Phone 7 Marketplace Programmatically Doug Holland has a post up on MSDN blogs talking about searching the WP7 Marketplace programmatically... ya know you should be able to do it... here's how. Beginners Guide to Visual Studio LightSwitch (Part - 5) Kunal Chowdhury has Part 5 of a tutorial series on Lightswitch up at SilverlightShow... working with custom validation this time, and for the first time in this series so far actually writes some code! PRISM 4 Custom Transitioning Region Sacha Barber took time to look at Prism4/MEF and Cinch2 and found things to be fine then wrote a custom PRISM region adaptor that uses a TransitionalElement from the Microsoft Transitionals project... code available, blog post to come. Get Application Title from Windows Phone Paul Sheriff has a cool chunk of code up... getting the Application's title programmatically... and other attributes as well, if you were wondering why you might wanna do that. Detecting Users Win7 Mobile Theme Color Mike Snow has a couple as well... first up is how to detect your user's theme... obviously useful if you wanna match it. Selecting an Item in a ComboBox after Adding Items Second for Mike Snow is a general Silverlight issue... setting the selected item on a ComboBox after filling it... if you haven't stumbled across this yet, you will... A Simplified Grid Markup Reloaded Peter Kuhn has a pair of posts up since last time... this first is an extension of Colin Eberhardt's simplified Grid markup system, but it's only useful if you don't plan on using Blend... can we get a show of hands? :) Silverlight: Fixing the BookShelf Sample Next Peter Kuhn has some changes to the Bookshelf code, but more importantly has some excelling tips about shader effects, Effects on Visual Elements and how to make best use of all the above. Displaying HTML Content in Windows Phone 7 Mike Ormond has a WP7 post up describing problems a customer had early on displaying rich text and an attempt to use the WebBrowser control to pull it off and the problems that caused... check out the resultant code, and read the comments as well. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Creating dynamic breadcrumb in asp.net mvc with mvcsitemap provider

    - by Jalpesh P. Vadgama
    I have done lots of breadcrumb-style navigation in normal ASP.NET Web Forms, and I was looking for the same thing for ASP.NET MVC. After searching the internet I found one great NuGet package, the MVC sitemap provider, which can easily be wired up as a site map provider. So let's see how it works. I have created a new MVC 3 web application called Breadcrumb, and I am now adding a reference to the sitemap provider via the NuGet package as follows. You can find more information about the MVC sitemap provider at the following URL: https://github.com/maartenba/MvcSiteMapProvid

    Once you add the sitemap provider, you will find an Mvc.sitemap file like the following. Here is the content of that file:

    <?xml version="1.0" encoding="utf-8" ?>
    <mvcSiteMap xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
                xmlns="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-3.0"
                xsi:schemaLocation="http://mvcsitemap.codeplex.com/schemas/MvcSiteMap-File-3.0 MvcSiteMapSchema.xsd"
                enableLocalization="true">
      <mvcSiteMapNode title="Home" controller="Home" action="Index">
        <mvcSiteMapNode title="About" controller="Home" action="About"/>
      </mvcSiteMapNode>
    </mvcSiteMap>

    Now that we have added the sitemap, it's time to make the breadcrumb dynamic. As we all know, the standard ASP.NET MVC template has action links for Home and About by default, like this:

    <div id="menucontainer">
      <ul id="menu">
        <li>@Html.ActionLink("Home", "Index", "Home")</li>
        <li>@Html.ActionLink("About", "About", "Home")</li>
      </ul>
    </div>

    I want to replace that with our sitemap provider and make it dynamic, so I have added the following code:

    <div id="menucontainer">
      @Html.MvcSiteMap().Menu(true)
    </div>

    That's it. This is the magic line: @Html.MvcSiteMap().Menu(true) will dynamically create the breadcrumb for you. Now let's run it in the browser. You can see that it has been created dynamically without writing any action-link code. So with the MvcSiteMap provider we don't have to write any code; we just add the menu syntax and the rest happens automatically. That's it. Hope you liked it. Stay tuned for more; till then, happy programming.
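    One note, offered as an assumption about the same package rather than something shown in this post: @Html.MvcSiteMap().Menu(true) renders a navigation menu built from the sitemap, while MvcSiteMapProvider also ships a @Html.MvcSiteMap().SiteMapPath() helper that renders the classic "Home > About" breadcrumb trail for the current node, if that is the shape of navigation you are after.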

    Read the article

  • Cross-platform application revolution

    - by anirudha
    Every developer knows that if they make a Windows application, it works only on Windows. That is a small, obvious limitation, but it is also a losing point: it makes developers think small, building only for Windows or only for the Mac. This is a big reason behind the success of the web, because nobody wants to purchase an operating system just to use an application on another platform; why would they pay when they cannot even try it? The web is better in that respect: whether it is IE6, IE8, Chrome 8, or Firefox 3.6.13 (even a beta), a good website looks the same in every browser, with perhaps some minor differences. Thinking in terms of cross-platform applications is much bigger than building an application for only part of the audience, where the audience is defined by the OS they use: if they use a Mac they cannot run your Windows application, and if they use Windows they cannot run your Mac application. The web is for everyone, from children to grandfathers, male and female; anyone can use the internet without worrying about what they have, whatever the OS or browser, even something as silly as IE6.

    Native applications, on the other hand, greet you with problems like "some component is missing, click here to get it", or "you can't use this software because you have Windows SP1/SP2/SP3", or "you need to install this first". This kind of trouble comes mainly from Microsoft software. Last year I found an issue on the Web Platform Installer where it forced users to install other software first: you needed to install Visual Studio 2008 before installing Visual Studio 2010 Express. Can anyone tell me why a user should have to get the old 2008 version when they are asking for the latest Express version? I never went back to check whether the issue was solved. Another thing: you cannot get IE9 on Windows XP. In that case, don't think or worry about it, because Firefox and Chrome are much better anyway. Microsoft never tells you about Firebug; they only talk about the tool in IE, which they call developer tools, because they are Microsoft and they only think about how to market their products. You need to install many things for no reason, such as SQL Server components even when you use another RDBMS. You cannot say no to them, because you need a tool and the tool requires a useless component called SQL Server. I have never found a website that forces me to install this-before-that before it lets me in. That is another good thing about the web: nothing is required. You do not need to install the .NET Framework 4 before enjoying Facebook or Twitter. You may have found that Microsoft's failed project, Window planet, forces you to get Silverlight before going there; I had never heard about it until a friend talked to me about it some months ago, and I found nothing better there. What would users do if Facebook forced them to install Silverlight, Adobe Flash, or the Microsoft .NET Framework 4? If you do not install it, Facebook tells you bye-bye, never come here before installing the framework. The door opens for you only after installing it, not before. The story is the same as a mother telling her child, "say sorry before you come into the house", when the child has done something wrong. The web never forces you to do something like that for it. Sometimes it even allows you to use another website's account there, which is a very fast login for you, because they know the importance of your time.

    Read the article

  • New January 2013 Release of the Ajax Control Toolkit

    - by Stephen.Walther
    I am super excited to announce the January 2013 release of the Ajax Control Toolkit! I have one word to describe this release and that word is “Charts” – we’ve added lots of great new chart controls to the Ajax Control Toolkit. You can download the new release directly from http://AjaxControlToolkit.CodePlex.com – or, just fire the following command from the Visual Studio Library Package Manager Console Window (NuGet): Install-Package AjaxControlToolkit You also can view the new chart controls by visiting the “live” Ajax Control Toolkit Sample Site. 5 New Ajax Control Toolkit Chart Controls The Ajax Control Toolkit contains five new chart controls: the AreaChart, BarChart, BubbleChart, LineChart, and PieChart controls. Here is a sample of each of the controls: AreaChart: BarChart: BubbleChart: LineChart: PieChart: We realize that people love to customize the appearance of their charts so all of the chart controls include properties such as color properties. The chart controls render the chart on the browser using SVG. The chart controls are compatible with any browser which supports SVG including Internet Explorer 9 and new and recent versions of Google Chrome, Mozilla Firefox, and Apple Safari. (If you attempt to display a chart on a browser which does not support SVG then you won’t get an error – you just won’t get anything). Updates to the HTML Sanitizer If you are using the HtmlEditorExtender on a public-facing website then it is really important that you enable the HTML Sanitizer to prevent Cross-Site Scripting (XSS) attacks. The HtmlEditorExtender uses the HTML Sanitizer by default. The HTML Sanitizer strips out any suspicious content (like JavaScript code and CSS expressions) from the HTML submitted with the HtmlEditorExtender. We followed the recommendations of OWASP and ha.ckers.org to identify suspicious content. We updated the HTML Sanitizer with this release to protect against new types of XSS attacks. The HTML Sanitizer now has over 220 unit tests. The Ajax Control Toolkit team would like to thank Gil Cohen who helped us identify and block additional XSS attacks. Change in Ajax Control Toolkit Version Format We ran out of numbers. The Ajax Control Toolkit was first released way back in 2006. In previous releases, the version of the Ajax Control Toolkit followed the format: Release Year + Date. So, the previous release was 60919 where 6 represented the 6th release year and 0919 represent September 19. Unfortunately, the AssembyVersion attribute uses a UInt16 data type which has a maximum size of 65,534. The number 70123 is bigger than 65,534 so we had to change our version format with this release. Fortunately, the AssemblyVersion attribute actually accepts four UInt16 numbers so we used another one. This release of the Ajax Control Toolkit is officially version 7.0123. This new version format should work for another 65,000 years. And yes, I realize that 7.0123 is less than 60,919, but we ran out of numbers. Summary I hope that you find the chart controls included with this latest release of the Ajax Control Toolkit useful. Let me know if you use them in applications that you build. And, let me know if you run into any issues using the new chart controls. Next month, back to improving the File Upload control – more exciting stuff.
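    To make the version limit described above concrete, here is an illustrative assembly attribute (not taken from the toolkit source). Each of the four components of an assembly version is stored as a 16-bit value, which is why a component like 70123 cannot build, while 7.0123 is fine:

        using System.Reflection;

        // Each component of major.minor.build.revision must fit in 16 bits.
        [assembly: AssemblyVersion("7.0123")]      // minor component is 123: fine

        // [assembly: AssemblyVersion("70123.0")]  // 70123 overflows the limit and fails to build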

    Read the article

  • Unable to apt-get upgrade in ubuntu 11.10

    - by blackhole
    These are the errors shows by different client Update Manager: Traceback (most recent call last): File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 968, in simulate trans.unauthenticated = self._simulate_helper(trans) File "/usr/lib/python2.7/dist-packages/aptdaemon/worker.py", line 1092, in _simulate_helper return depends, self._cache.required_download, \ File "/usr/lib/python2.7/dist-packages/apt/cache.py", line 235, in required_download pm.get_archives(fetcher, self._list, self._records) SystemError: E:Method has died unexpectedly!, E:Sub-process returned an error code (100), E:Method /usr/lib/apt/methods/ did not start correctly Synaptic package Manager E: Method has died unexpectedly! E: Sub-process returned an error code (100) E: Method /usr/lib/apt/methods/ did not start correctly E: Unable to lock the download directory Command: sudo apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... Done The following packages will be upgraded: libfreetype6 libfreetype6-dev 2 upgraded, 0 newly installed, 0 to remove and 0 not upgraded. Failed to exec method /usr/lib/apt/methods/ E: Method has died unexpectedly! E: Sub-process returned an error code (100) E: Method /usr/lib/apt/methods/ did not start correctly Can anyone one tell me how to resolve these issues ? I have no volatile packages or anything so i am even posting the preview of my sources.list file. # deb cdrom:[Ubuntu 10.10 _Maverick Meerkat_ - Release i386 (20101007)]/ maverick main restricted # See http://help.ubuntu.com/community/UpgradeNotes for how to upgrade to # newer versions of the distribution. deb http://in.archive.ubuntu.com/ubuntu/ oneiric main restricted ## Major bug fix updates produced after the final release of the ## distribution. deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates main restricted ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team. Also, please note that software in universe WILL NOT receive any ## review or updates from the Ubuntu security team. deb http://in.archive.ubuntu.com/ubuntu/ oneiric universe deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates universe ## N.B. software from this repository is ENTIRELY UNSUPPORTED by the Ubuntu ## team, and may not be under a free licence. Please satisfy yourself as to ## your rights to use the software. Also, please note that software in ## multiverse WILL NOT receive any review or updates from the Ubuntu ## security team. deb http://in.archive.ubuntu.com/ubuntu/ oneiric multiverse deb http://in.archive.ubuntu.com/ubuntu/ oneiric-updates multiverse ## Uncomment the following two lines to add software from the 'backports' ## repository. ## N.B. software from this repository may not have been tested as ## extensively as that contained in the main release, although it includes ## newer versions of some applications which may provide useful features. ## Also, please note that software in backports WILL NOT receive any review ## or updates from the Ubuntu security team. # deb http://in.archive.ubuntu.com/ubuntu/ maverick-backports main restricted universe multiverse # deb-src http://in.archive.ubuntu.com/ubuntu/ maverick-backports main restricted universe multiverse ## Uncomment the following two lines to add software from Canonical's ## 'partner' repository. ## This software is not part of Ubuntu, but is offered by Canonical and the ## respective vendors as a service to Ubuntu users. 
deb http://archive.canonical.com/ubuntu oneiric partner deb-src http://archive.canonical.com/ubuntu oneiric partner ## This software is not part of Ubuntu, but is offered by third-party ## developers who want to ship their latest software. deb http://extras.ubuntu.com/ubuntu oneiric main deb-src http://extras.ubuntu.com/ubuntu oneiric main deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security main restricted deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security universe deb http://in.archive.ubuntu.com/ubuntu/ oneiric-security multiverse # deb http://archive.canonical.com/ lucid partner

    Read the article

  • SQL SERVER – Discard Results After Query Execution – SSMS

    - by pinaldave
    The first thing I do any day is to turn on the computer. Today I woke up and as soon as I turned on the computer I saw a chat message from a friend. He was a bit confused and wanted me to help him. Just as usual I am keeping the relevant conversation in focus and documenting our conversation as chat. Let us call him Ajit. Ajit: Pinal, every time I run a query there is no result displayed in the SSMS but when I run the query in my application it works and returns an appropriate result. Pinal:  Have you tried with different parameters? Ajit: Same thing. However, it works from another computer when I connect to the same server with the same query parameters? Pinal: What? That is new and I believe it is something to do with SSMS and not with the server. Send me screenshot please. Ajit: I believe so, let me send you a screenshot, Pinal: (looking at the screenshot) Oh man, there is no result-tab at all. Ajit: That is what the problem is. It does not have the tab which displays the result. This works just fine from another computer. Pinal: Have you referred Nakul’s blog post – SSMS – Query result options – Discard result after query executes, that talks about setting which can discard the query results after execution. (After a while) Ajit: I think it seems like on the computer where I am running the query my SSMS seems to have the option enabled related to discarding results. I fixed it by following Nakul’s blog post. Pinal: Great! Quite often I get the question what is the importance of the feature. Let us first see how to turn on or turn off this feature in SQL Server Management Studio 2012. In SSMS 2012 go to Tools >> Options >> Query Results > SQL Server >> Results to Grid >> Discard Results After Query Execution. When enabled this option will discard results after the execution. The advantage of disabling the option is that it will improve the performance by using less memory. However the real question is why would someone enable or disable the option. What are the cases when someone wants to run the query but do not care about the result? Matter of the fact, it does not make sense at all to run query and not care about the result. The matter of the fact, I can see quite a few reasons for using this option. I often enable this option when I am doing performance tuning exercise. During performance tuning exercise when I am working with execution plans and do not need results to verify every time or when I am tuning Indexes and its effect on execution plan I do not need the results. In this kind of situations I do keep this option on and discard the results. It always helps me big time as in most of the performance tuning exercise I am dealing with huge amount of the data and dealing with this data can be expensive. Nakul’s has done the experiment here already but I am going to repeat the same again using AdventureWorks Database. Run following T-SQL Script with and without enabling the option to discard the results. USE AdventureWorks2012 GO SELECT * FROM Sales.SalesOrderDetail GO 10 After enabling Discard Results After Query Execution After disabling Discard Results After Query Execution Well, this is indeed a good option when someone is debugging the execution plan or does not want the result to be displayed. Please note that this option does not reduce IO or CPU usage for SQL Server. It just discards the results after execution and a good help for debugging on the development server. 
Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Server Management Studio, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • WEB203 – Jump into Silverlight!… and Become Effective Immediately with Tim Huckaby, Fou

    - by Robert Burger
    Getting ready for the good stuff. Definitely wish there were more Silverlight and WCF RIA sessions, but this is a start.  Was lucky to get a coveted power-enabled seat.  Luckily, due to my trustily slow Verizon data card, I can get these notes out amidst a total Internet outage here.  This is the second breakout session of the day, and is by far standing-room only.  I stepped out before the session started to get a cool Diet COKE and wouldn’t have gotten back in if I didn’t already have a seat. Tim says this is an intro session and that he’s been begging for intro sessions at TechEd for years and that by looking at this audience, he thinks the demand is there.  Admittedly, I didn’t know this was an intro session, or I might have gone elsewhere.  But, it was the very first Silverlight session, so I had to be here. Tim says he will be providing a very good comprehensive reference application at the end of the presentation.  He has just demoed it, and it is a full CRUD-based Sales Manager application based on…  AdventureWorks! Session Agenda What it is / How to get started Declarative Programming Layout and Controls, Events and Commands Working with Data Adding Style to Your Application   Silverlight…  “WPF Light” Why is the download 4.2MB?  Because the direct competitor is a 4.2MB download.  There is no technical reason it is not the entire framework.  It is purely to “be competitive”.   Getting Started Get all of the following downloads from www.silverlight.net/getstarted Install VS2010 or Visual Web Developer Express 2010 Install Silverlight 4 Tools for VS2010 Install Expression Blend 4 Install the Silverlight 4 Toolkit   Reference Application Features Uses MVVM pattern – a way to move data access code that would normally be inline within the UI and placing it in nice data access libraries Images loaded dynamically from the database, converting GIF to PNG because Silverlight does not support GIF. LINQ to SQL is the data access model WCF is the data provider and is using binary message encoding   Declarative Programming XAML replaces code for UI representation Attributes control Layout and Style Event handlers wired-up in XAML Declarative Data Binding   Layout Overview Content rendering flows inside of parent Fixed positioning (Canvas) is seldom used Panels are used to house content Margins and Padding over fixed size   Panels StackPanel – Arranges child elements into a single line oriented horizontally or vertically Grid – A flexible grid are that consists of rows and columns Canvas – An are where positions are specifically fixed WrapPanel (in Toolkit) – Positions child elements in sequential position left to right and top to bottom. DockPanel (in Toolkit) – Positions child controls within a dockable area   Positioning Horizontal and Vertical Alignment Margin – Separates an element from neighboring elements Padding – Enlarges the effective size of an element by a thickness   Controls Overview Not all controls created equal Silverlight, as a subset of WPF, so many WPF controls do not exist in the core Siverlight release Silverlight Toolkit continues to add controls, but are released in different quality bands Plenty of good 3rd party controls to fill the gaps Windows Phone 7 is to have 95% of controls available in Silverlight Core and Toolkit.   
Events and Commands Standard .NET Events Routed Events Commands – based on the ICommand interface – logical action that can be invoked in several ways   Adding Style to Your Application Resource Dictionaries – Contains a hash table of key/value pairs.  Silverlight can only use Static Resources whereas WPF can also use Dynamic Resources Visual State Manager Silverlight 4 supports Implicit styles ResourceDictionary.MergedDictionaries combines many different file-based resources   Downloads
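    The "Commands" bullet above is easiest to see in code. Below is a minimal relay-style ICommand, a common hand-rolled pattern rather than anything from the session materials: a logical action a Button or key gesture can invoke, plus an optional check for whether it can currently run.

        using System;
        using System.Windows.Input;

        public class RelayCommand : ICommand
        {
            private readonly Action<object> _execute;
            private readonly Func<object, bool> _canExecute;

            public RelayCommand(Action<object> execute, Func<object, bool> canExecute = null)
            {
                if (execute == null) throw new ArgumentNullException("execute");
                _execute = execute;
                _canExecute = canExecute;
            }

            public event EventHandler CanExecuteChanged;

            public bool CanExecute(object parameter)
            {
                return _canExecute == null || _canExecute(parameter);
            }

            public void Execute(object parameter)
            {
                _execute(parameter);
            }

            // Call this when the condition behind CanExecute changes so bound
            // controls can re-evaluate their enabled state.
            public void RaiseCanExecuteChanged()
            {
                EventHandler handler = CanExecuteChanged;
                if (handler != null) handler(this, EventArgs.Empty);
            }
        }

    A view model then exposes a property such as public ICommand SaveCommand, and the XAML binds it with Command="{Binding SaveCommand}".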

    Read the article

  • Oracle Big Data Software Downloads

    - by Mike.Hallett(at)Oracle-BI&EPM
    Companies have been making business decisions for decades based on transactional data stored in relational databases. Beyond that critical data, is a potential treasure trove of less structured data: weblogs, social media, email, sensors, and photographs that can be mined for useful information. Oracle offers a broad integrated portfolio of products to help you acquire and organize these diverse data sources and analyze them alongside your existing data to find new insights and capitalize on hidden relationships. Oracle Big Data Connectors Downloads here, includes: Oracle SQL Connector for Hadoop Distributed File System Release 2.1.0 Oracle Loader for Hadoop Release 2.1.0 Oracle Data Integrator Companion 11g Oracle R Connector for Hadoop v 2.1 Oracle Big Data Documentation The Oracle Big Data solution offers an integrated portfolio of products to help you organize and analyze your diverse data sources alongside your existing data to find new insights and capitalize on hidden relationships. Oracle Big Data, Release 2.2.0 - E41604_01 zip (27.4 MB) Integrated Software and Big Data Connectors User's Guide HTML PDF Oracle Data Integrator (ODI) Application Adapter for Hadoop Apache Hadoop is designed to handle and process data that is typically from data sources that are non-relational and data volumes that are beyond what is handled by relational databases. Typical processing in Hadoop includes data validation and transformations that are programmed as MapReduce jobs. Designing and implementing a MapReduce job usually requires expert programming knowledge. However, when you use Oracle Data Integrator with the Application Adapter for Hadoop, you do not need to write MapReduce jobs. Oracle Data Integrator uses Hive and the Hive Query Language (HiveQL), a SQL-like language for implementing MapReduce jobs. Employing familiar and easy-to-use tools and pre-configured knowledge modules (KMs), the application adapter provides the following capabilities: Loading data into Hadoop from the local file system and HDFS Performing validation and transformation of data within Hadoop Loading processed data from Hadoop to an Oracle database for further processing and generating reports Oracle Database Loader for Hadoop Oracle Loader for Hadoop is an efficient and high-performance loader for fast movement of data from a Hadoop cluster into a table in an Oracle database. It pre-partitions the data if necessary and transforms it into a database-ready format. Oracle Loader for Hadoop is a Java MapReduce application that balances the data across reducers to help maximize performance. Oracle R Connector for Hadoop Oracle R Connector for Hadoop is a collection of R packages that provide: Interfaces to work with Hive tables, the Apache Hadoop compute infrastructure, the local R environment, and Oracle database tables Predictive analytic techniques, written in R or Java as Hadoop MapReduce jobs, that can be applied to data in HDFS files You install and load this package as you would any other R package. 
Using simple R functions, you can perform tasks such as: Access and transform HDFS data using a Hive-enabled transparency layer Use the R language for writing mappers and reducers Copy data between R memory, the local file system, HDFS, Hive, and Oracle databases Schedule R programs to execute as Hadoop MapReduce jobs and return the results to any of those locations Oracle SQL Connector for Hadoop Distributed File System Using Oracle SQL Connector for HDFS, you can use an Oracle Database to access and analyze data residing in Hadoop in these formats: Data Pump files in HDFS Delimited text files in HDFS Hive tables For other file formats, such as JSON files, you can stage the input in Hive tables before using Oracle SQL Connector for HDFS. Oracle SQL Connector for HDFS uses external tables to provide Oracle Database with read access to Hive tables, and to delimited text files and Data Pump files in HDFS. Related Documentation Cloudera's Distribution Including Apache Hadoop Library HTML Oracle R Enterprise HTML Oracle NoSQL Database HTML Recent Blog Posts Big Data Appliance vs. DIY Price Comparison Big Data: Architecture Overview Big Data: Achieve the Impossible in Real-Time Big Data: Vertical Behavioral Analytics Big Data: In-Memory MapReduce Flume and Hive for Log Analytics Building Workflows in Oozie

    Read the article

  • Removing old kernel entries in Grub

    - by To Do
    I regularly delete old kernels leaving only the latest two entries using Synaptic. I'm using Precise. However in my Grub "previous Linux version" menu there are quite a few entries labelled 2.6.8. I cannot find these linux-images in Synaptic. dpkg -l | grep linux-image Gives: rc linux-image-3.0.0-17-generic 3.0.0-17.30 Linux kernel image for version 3.0.0 on x86/x86_64 ii linux-image-3.2.0-27-generic 3.2.0-27.43 Linux kernel image for version 3.2.0 on 32 bit x86 SMP ii linux-image-3.2.0-29-generic 3.2.0-29.46 Linux kernel image for version 3.2.0 on 32 bit x86 SMP ii linux-image-3.4.0-030400-generic 3.4.0-030400.201205210521 Linux kernel image for version 3.4.0 on 32 bit x86 SMP ii linux-image-generic 3.2.0.29.31 Generic Linux kernel image Sudo update-grub gives: Generating grub.cfg ... Found linux image: /boot/vmlinuz-3.4.0-030400-generic Found initrd image: /boot/initrd.img-3.4.0-030400-generic Found linux image: /boot/vmlinuz-3.2.0-29-generic Found initrd image: /boot/initrd.img-3.2.0-29-generic Found linux image: /boot/vmlinuz-3.2.0-27-generic Found initrd image: /boot/initrd.img-3.2.0-27-generic Found linux image: /boot/vmlinuz-2.6.38-11-generic Found initrd image: /boot/initrd.img-2.6.38-11-generic Found linux image: /boot/vmlinuz-2.6.38-10-generic Found initrd image: /boot/initrd.img-2.6.38-10-generic Found linux image: /boot/vmlinuz-2.6.38-8-generic Found initrd image: /boot/initrd.img-2.6.38-8-generic Found memtest86+ image: /boot/memtest86+.bin Found Windows Vista (loader) on /dev/sda1 sudo apt-get remove linux-image-2.6.8-8-generic gives: E: Unable to locate package linux-image-2.6.8-8-generic E: Couldn't find any package by regex 'linux-image-2.6.8-8-generic' My boot folder contains the following: abi-2.6.38-10-generic initrd.img-3.4.0-030400-generic abi-2.6.38-11-generic memtest86+.bin abi-2.6.38-8-generic memtest86+_multiboot.bin abi-3.2.0-27-generic System.map-2.6.38-10-generic abi-3.2.0-29-generic System.map-2.6.38-11-generic abi-3.4.0-030400-generic System.map-2.6.38-8-generic config-2.6.38-10-generic System.map-3.2.0-27-generic config-2.6.38-11-generic System.map-3.2.0-29-generic config-2.6.38-8-generic System.map-3.4.0-030400-generic config-3.2.0-27-generic vmcoreinfo-2.6.38-10-generic config-3.2.0-29-generic vmcoreinfo-2.6.38-11-generic config-3.4.0-030400-generic vmcoreinfo-2.6.38-8-generic extlinux vmlinuz-2.6.38-10-generic grub vmlinuz-2.6.38-11-generic initrd.img-2.6.38-10-generic vmlinuz-2.6.38-8-generic initrd.img-2.6.38-11-generic vmlinuz-3.2.0-27-generic initrd.img-2.6.38-8-generic vmlinuz-3.2.0-29-generic initrd.img-3.2.0-27-generic vmlinuz-3.4.0-030400-generic initrd.img-3.2.0-29-generic and ls -l /etc/grub.d yields: total 56 -rwxr-xr-x 1 root root 6715 Apr 17 20:16 00_header -rwxr-xr-x 1 root root 5522 Oct 1 2011 05_debian_theme -rwxr-xr-x 1 root root 7407 May 17 09:22 10_linux -rwxr-xr-x 1 root root 6335 Apr 17 20:16 20_linux_xen -rwxr-xr-x 1 root root 1588 May 3 2011 20_memtest86+ -rwxr-xr-x 1 root root 7603 Apr 17 20:16 30_os-prober -rwxr-xr-x 1 root root 214 Oct 1 2011 40_custom -rwxr-xr-x 1 root root 95 Oct 1 2011 41_custom -rw-r--r-- 1 root root 483 Oct 1 2011 README gdisk -l /dev/sda yields: Partition table scan: MBR: MBR only BSD: not present APM: not present GPT: not present *************************************************************** Found invalid GPT and valid MBR; converting MBR to GPT format. 
*************************************************************** Disk /dev/sda: 312581808 sectors, 149.1 GiB Logical sector size: 512 bytes Disk identifier (GUID): F832A498-05E1-4615-B5B1-757ACB4A757A Partition table holds up to 128 entries First usable sector is 34, last usable sector is 312581774 Partitions will be aligned on 2048-sector boundaries Total free space is 4183661 sectors (2.0 GiB) Number Start (sector) End (sector) Size Code Name 1 2048 61442047 29.3 GiB 0700 Microsoft basic data 3 163842048 169986047 2.9 GiB 8200 Linux swap 4 169986048 312578047 68.0 GiB 0700 Microsoft basic data 5 61444096 159666175 46.8 GiB 8300 Linux filesystem Please help with removing the old and inexistent kernels from Grub.

    Read the article

  • Windows Azure Recipe: Mobile Computing

    - by Clint Edmonson
    A while back, mashups were all the rage. The idea was to compose solutions that provided aggregation and integration across applications and services to make information more available, useful, and personal. Mashups ushered in the era of Web 2.0 in all it’s socially connected goodness. They taught us that to be successful, we needed to add web service APIs to our web applications. Web and client based mashups met with great success and have evolved even further with the introduction of the internet connected smartphone. Nothing is more available, useful, or personal than our smartphones. The current generation of cloud connected mobile computing mashups allow our mobilized workforces to receive, process, and react to information from disparate sources faster than ever before. Drivers Integration Reach Time to market Solution Here’s a sketch of a prototypical mobile computing solution using Windows Azure: Ingredients Web Role – with the phone running a dedicated client application, the web role is responsible for serving up backend web services that implement the solution’s core connected functionality. Database – used to store core operational and workflow data for the solution’s web services. Access Control – this service is used to authenticate and manage users identity, roles, and groups, possibly in conjunction with 3rd identity providers such as Windows LiveID, Google, Yahoo!, and Facebook. Worker Role – this role is used to handle the orchestration of long-running, complex, asynchronous operations. While much of the integration and interaction with other services can be handled directly by the mobile client application, it’s possible that the backend may need to integrate with 3rd party services as well. Offloading this work to a worker role better distributes computing resources and keeps the web roles focused on direct client interaction. Queues – these provide reliable, persistent messaging between applications and processes. They are an absolute necessity once asynchronous processing is involved. Queues facilitate the flow of distributed events and allow a solution to send push notifications back to mobile devices at appropriate times. Training & Resources These links point to online Windows Azure training labs and resources where you can learn more about the individual ingredients described above. (Note: The entire Windows Azure Training Kit can also be downloaded for offline use.) Windows Azure (16 labs) Windows Azure is an internet-scale cloud computing and services platform hosted in Microsoft data centers, which provides an operating system and a set of developer services which can be used individually or together. It gives developers the choice to build web applications; applications running on connected devices, PCs, or servers; or hybrid solutions offering the best of both worlds. New or enhanced applications can be built using existing skills with the Visual Studio development environment and the .NET Framework. With its standards-based and interoperable approach, the services platform supports multiple internet protocols, including HTTP, REST, SOAP, and plain XML SQL Azure (7 labs) Microsoft SQL Azure delivers on the Microsoft Data Platform vision of extending the SQL Server capabilities to the cloud as web-based services, enabling you to store structured, semi-structured, and unstructured data. 
Windows Azure Services (9 labs) As applications collaborate across organizational boundaries, ensuring secure transactions across disparate security domains is crucial but difficult to implement. Windows Azure Services provides hosted authentication and access control using powerful, secure, standards-based infrastructure. Windows Azure Toolkit for Windows Phone The Windows Azure Toolkit for Windows Phone is designed to make it easier for you to build mobile applications that leverage cloud services running in Windows Azure. The toolkit includes Visual Studio project templates for Windows Phone and Windows Azure, class libraries optimized for use on the phone, sample applications, and documentation Windows Azure Toolkit for iOS The Windows Azure Toolkit for iOS is a toolkit for developers to make it easy to access Windows Azure storage services from native iOS applications. The toolkit can be used for both iPhone and iPad applications, developed using Objective-C and XCode. Windows Azure Toolkit for Android The Windows Azure Toolkit for Android is a toolkit for developers to make it easy to work with Windows Azure from native Android applications. The toolkit can be used for native Android applications developed using Eclipse and the Android SDK. See my Windows Azure Resource Guide for more guidance on how to get started, including links web portals, training kits, samples, and blogs related to Windows Azure.
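    As a sketch of the "Queues" ingredient described above, assuming the 1.x Windows Azure StorageClient library of that era (the queue name and configuration setting are illustrative), a web role can drop a message on a queue for a worker role to dequeue and turn into a push notification:

        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.ServiceRuntime;
        using Microsoft.WindowsAzure.StorageClient;

        public static class NotificationQueue
        {
            public static void EnqueuePushNotification(string deviceId, string message)
            {
                // "DataConnectionString" is assumed to be defined in the role's
                // service configuration.
                CloudStorageAccount account = CloudStorageAccount.Parse(
                    RoleEnvironment.GetConfigurationSettingValue("DataConnectionString"));

                CloudQueueClient client = account.CreateCloudQueueClient();
                CloudQueue queue = client.GetQueueReference("push-notifications");
                queue.CreateIfNotExist();

                // A worker role polls this queue and calls the push notification
                // service for the target device at the appropriate time.
                queue.AddMessage(new CloudQueueMessage(deviceId + "|" + message));
            }
        }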

    Read the article

  • Career in Mobile Software/Application Development [closed]

    - by pramod
    i m planning to do a course on Wireless & mobile computing.The syllabus are given below.Please check & let me know whether its worth to do.How is the job prospects after that.I m a fresher & from electronic Engg.The modules are- *Wireless and Mobile Computing (WiMC) – Modules* C, C++ Programming and Data Structures 100 Hours C Revision C, C++ programming tools on linux(Vi editor, gdb etc.) OOP concepts Programming constructs Functions Access Specifiers Classes and Objects Overloading Inheritance Polymorphism Templates Data Structures in C++ Arrays, stacks, Queues, Linked Lists( Singly, Doubly, Circular) Trees, Threaded trees, AVL Trees Graphs, Sorting (bubble, Quick, Heap , Merge) System Development Methodology 18 Hours Software life cycle and various life cycle models Project Management Software: A Process Various Phases in s/w Development Risk Analysis and Management Software Quality Assurance Introduction to Coding Standards Software Project Management Testing Strategies and Tactics Project Management and Introduction to Risk Management Java Programming 110 Hours Data Types, Operators and Language Constructs Classes and Objects, Inner Classes and Inheritance Inheritance Interface and Package Exceptions Threads Java.lang Java.util Java.awt Java.io Java.applet Java.swing XML, XSL, DTD Java n/w programming Introduction to servlet Mobile and Wireless Technologies 30 Hours Basics of Wireless Technologies Cellular Communication: Single cell systems, multi-cell systems, frequency reuse, analog cellular systems, digital cellular systems GSM standard: Mobile Station, BTS, BSC, MSC, SMS sever, call processing and protocols CDMA standard: spread spectrum technologies, 2.5G and 3G Systems: HSCSD, GPRS, W-CDMA/UMTS,3GPP and international roaming, Multimedia services CDMA based cellular mobile communication systems Wireless Personal Area Networks: Bluetooth, IEEE 802.11a/b/g standards Mobile Handset Device Interfacing: Data Cables, IrDA, Bluetooth, Touch- Screen Interfacing Wireless Security, Telemetry Java Wireless Programming and Applications Development(J2ME) 100 Hours J2ME Architecture The CLDC and the KVM Tools and Development Process Classification of CLDC Target Devices CLDC Collections API CLDC Streams Model MIDlets MIDlet Lifecycle MIDP Programming MIDP Event Architecture High-Level Event Handling Low-Level Event Handling The CLDC Streams Model The CLDC Networking Package The MIDP Implementation Introduction to WAP, WML Script and XHTML Introduction to Multimedia Messaging Services (MMS) Symbian Programming 60 Hours Symbian OS basics Symbian OS services Symbian OS organization GUI approaches ROM building Debugging Hardware abstraction Base porting Symbian OS reference design porting File systems Overview of Symbian OS Development – DevKits, CustKits and SDKs CodeWarrior Tool Application & UI Development Client Server Framework ECOM STDLIB in Symbian iPhone Programming 80 Hours Introducing iPhone core specifications Understanding iPhone input and output Designing web pages for the iPhone Capturing iPhone events Introducing the webkit CSS transforms transitions and animations Using iUI for web apps Using Canvas for web apps Building web apps with Dashcode Writing Dashcode programs Debugging iPhone web pages SDK programming for web developers An introduction to object-oriented programming Introducing the iPhone OS Using Xcode and Interface builder Programming with the SDK Toolkit OS Concepts & Linux Programming 60 Hours Operating System Concepts What is an OS? 
Processes Scheduling & Synchronization Memory management Virtual Memory and Paging Linux Architecture Programming in Linux Linux Shell Programming Writing Device Drivers Configuring and Building GNU Cross-tool chain Configuring and Compiling Linux Virtual File System Porting Linux on Target Hardware WinCE.NET and Database Technology 80 Hours Execution Process in .NET Environment Language Interoperability Assemblies Need of C# Operators Namespaces & Assemblies Arrays Preprocessors Delegates and Events Boxing and Unboxing Regular Expression Collections Multithreading Programming Memory Management Exceptions Handling Win Forms Working with database ASP .NET Server Controls and client-side scripts ASP .NET Web Server Controls Validation Controls Principles of database management Need of RDBMS etc Client/Server Computing RDBMS Technologies Codd’s Rules Data Models Normalization Techniques ER Diagrams Data Flow Diagrams Database recovery & backup SQL Android Application 80 Hours Introduction of android Why develop for android Android SDK features Creating android activities Fundamental android UI design Intents, adapters, dialogs Android Technique for saving data Data base in Androids Maps, Geocoding, Location based services Toast, using alarms, Instant messaging Using blue tooth Using Telephony Introducing sensor manager Managing network and wi-fi connection Advanced androids development Linux kernel security Implement AIDL Interface. Project 120 Hours

    Read the article

  • Customize Entity Framework SSDL &amp; SQL Generation

    - by Dane Morgridge
    In almost every talk I have done on Entity Framework I get questions on how to do custom SSDL or SQL when using model-first development.  Quite a few of these questions have required custom changes to the SSDL, which of course can be a problem if it is getting auto-generated.  Luckily, there is a tool that can help.  In the Visual Studio Gallery on MSDN, there is the Entity Designer Database Generation Power Pack. You have the ability to select different generation strategies and it also allows you to inject custom T4 templates into the generation workflow so that you can customize the SSDL and SQL generation.  When you select to generate a database from a model the dialog is replaced by one with more options:   You can clone the individual workflow for either the current project or current machine.  The templates are installed at “C:\Program Files (x86)\Microsoft Visual Studio 10.0\Common7\IDE\Extensions\Microsoft\Entity Framework Tools\DBGen” on my local machine and you can make a copy of any template there.  If you clone the strategy and open it up, you will get the following workflow. Each item in the sequence defines the execution of a T4 template.  The XAML for the workflow is listed below so you can see where the T4 files are defined.  You can simply make a copy of an existing template and make whatever changes you need.

    <Activity x:Class="GenerateDatabaseScriptWorkflow" ... >
      <x:Members>
        <x:Property Name="Csdl" Type="InArgument(sde:EdmItemCollection)" />
        <x:Property Name="ExistingSsdl" Type="InArgument(s:String)" />
        <x:Property Name="ExistingMsl" Type="InArgument(s:String)" />
        <x:Property Name="Ssdl" Type="OutArgument(s:String)" />
        <x:Property Name="Msl" Type="OutArgument(s:String)" />
        <x:Property Name="Ddl" Type="OutArgument(s:String)" />
        <x:Property Name="SmoSsdl" Type="OutArgument(ss:SsdlServer)" />
      </x:Members>
      <Sequence>
        <dbtk:ProgressBarStartActivity />
        <dbtk:CsdlToSsdlTemplateActivity SsdlOutput="[Ssdl]" TemplatePath="$(VSEFTools)\DBGen\CSDLToSSDL_TPT.tt" />
        <dbtk:CsdlToMslTemplateActivity MslOutput="[Msl]" TemplatePath="$(VSEFTools)\DBGen\CSDLToMSL_TPT.tt" />
        <ded:SsdlToDdlActivity ExistingSsdlInput="[ExistingSsdl]" SsdlInput="[Ssdl]" DdlOutput="[Ddl]" />
        <dbtk:GenerateAlterSqlActivity DdlInputOutput="[Ddl]" DeployToScript="True" DeployToDatabase="False" />
        <dbtk:ProgressBarEndActivity ClosePopup="true" />
      </Sequence>
    </Activity>

    So as you can see, this tool enables you to make some pretty heavy customizations to how the SSDL and SQL get generated.  You can get more info and download the tool from: http://visualstudiogallery.msdn.microsoft.com/en-us/df3541c3-d833-4b65-b942-989e7ec74c87.  There is a comments section on the site, so make sure you let the team know what you like and what you don’t like.  Enjoy!

    Read the article

  • Welcome 2011

    - by WeigeltRo
    Things that happened in 2010

    MIX10 was absolutely fantastic. Read my report of MIX10 to see why.

    The dotnet Cologne 2010, the community conference organized by the .NET user group Köln and my own group Bonn-to-Code.Net, became an even bigger success than I dared to dream of.

    There was a huge discrepancy between the efforts by Microsoft to support .NET user groups in organizing public live streaming events of the PDC keynote (the dotnet Cologne team joined forces with netug Niederrhein to organize the PDCologne) and the actual content of the keynote. The reaction of the audience at our event was “meh”, and even worse, I seriously doubt we’ll ever get that number of people to such an event (which on top of that suffered from technical difficulties beyond our control).

    What definitely would have deserved the public live streaming event treatment was the Silverlight Firestarter (aka “Silverlight Damage Control”) event. And maybe we would have thought about organizing something if it weren’t for the “burned earth” left by the PDC keynote. Anyway, the stuff shown at the Firestarter keynote was the topic of conversations among colleagues days later (“did you see that? oh yeah, that was seriously cool”).

    Things that I have learned/observed/noticed in 2010

    In the long run, there’s a huge difference between “it works pretty well” and “it just works and I never have to think about it”. I had to get rid of my USB graphics adapter powering the third monitor (read about it in this blog post). Various small issues (desktop icons sometimes moving their positions after a reboot for no apparent reason, at least one game I couldn’t get to run at all, all three monitors sometimes simply refusing to wake up after standby) finally made me buy a PCIe 1x graphics adapter. If you’re interested: the combination of an NVIDIA GTX 460 and a GT 220 has been running in “don’t make me think” mode for a couple of months now.

    PowerPoint 2010 is a seriously cool piece of software. Not only the new hardware-accelerated effects, but also features like built-in background removal and picture processing (which in many cases are simply “good enough” and save a lot of time) or the smart guides.

    Outlook 2010 crashes on me a lot. I haven’t been successful in reproducing these crashes; they just happen every couple of days on different occasions (the only thing in common: I clicked something in the main window – yeah, very helpful observation).

    Visual Studio 2010 reminds me of Visual Studio 2005 before SP1, which is actually not a good thing to say about a piece of software. I think it’s telling that Microsoft’s message regarding the beta of SP1 has been different from earlier service pack betas (promising an upgrade path from the beta to the RTM sounds to me like “please, please use it NOW!”).

    I have a love/hate relationship with ReSharper. I don’t want to develop without it, but at the same time I can’t fail to notice that ReSharper takes a heavy toll in terms of performance and sometimes stability.

    Things I’m looking forward to in 2011

    Obviously, the dotnet Cologne 2011. We have already been able to score some big-name sponsors (Microsoft, Intel), but we’re still looking for more. And be assured that we’ll make sure our partners get the most out of their contribution, regardless of how big or small.

    MIX11, period.

    Silverlight 5 is going to be great. The only thing I’m a bit nervous about is that I still haven’t read anything official on whether the next C# version’s async/await will be in it.
Leaving that out would be really stupid considering the end-of-2011 release of SL5 (moving the next release way into the future).

    Read the article

  • F# and the rose-tinted reflection

    - by CliveT
    We're already seeing increasing use of many cores on client desktops. It is a change that has been long predicted. It is not just a change in architecture, but in our notions of efficiency in a program. No longer can we focus on the asymptotic complexity of an algorithm by counting the steps that a single-core processor would take to execute it. Instead we'll soon be more concerned about the scalability of the algorithm and how well we can increase the performance as we increase the number of cores. This may even lead us to throw away our most efficient algorithms, and switch to less efficient algorithms that scale better. We might even be willing to waste cycles in order to speculatively execute at the algorithm rather than the hardware level.

    State is the big headache in this parallel world. At the hardware level, main memory doesn't necessarily contain the definitive value corresponding to a particular address. An update to a location might still be held in a CPU's local cache and it might be some time before the value gets propagated. To get the latest value, and the notion of "latest" takes a lot of defining in this world of rapidly mutating state, the CPUs may well need to communicate to decide who has the definitive value of a particular address in order to avoid lost updates. At the user program level, this means programmers will need to lock objects before modifying them, or attempt to avoid the overhead of locking by understanding the memory models at a very deep level.

    I think it's this need to avoid statefulness that has led to the recent resurgence of interest in functional languages. In the 1980s, functional languages started getting traction when research was carried out into how programs in such languages could be auto-parallelised. Sadly, the impracticality of some of the languages, the overheads of communication during this parallel execution, and rapid improvements in compiler technology on stock hardware meant that the functional languages fell by the wayside. The one thing that these languages were good at was getting rid of implicit state, and this single idea seems like a solution to the problems we are going to face in the coming years.

    Whether these languages will catch on is hard to predict. The mindset for writing a program in a functional language is really very different from the way that object-oriented problem decomposition happens - one has to focus on the verbs instead of the nouns, which takes some getting used to. There are a number of hybrid functional/object languages that have been becoming more popular in recent times. These half-way houses make it easy to use functional ideas for some parts of the program while still allowing access to the underlying object-focused platform without a great deal of impedance mismatch. One example is F# running on the CLR which, in Visual Studio 2010, has become a first-class member of the pack. Inside Visual Studio 2010, the tooling for F# has improved to the point where it is easy to set breakpoints and watch values change while debugging at the source level. In my opinion, it is the tooling support that will enable the widespread adoption of functional languages - without this support, people will put off any transition into the functional world for as long as they possibly can. Without tool support, these languages will also be hard to learn. One tool that doesn't currently support F# is Reflector.
The idea of decompiling IL to a functional language is daunting, but F# is potentially so important I couldn't dismiss the idea. As I'm currently developing Reflector 6.5, I thought it wise to take four days just to see how far I could get in doing so, even if it achieved little more than to be clearer on how much was possible, and how long it might take. You can read what happened here, and of the insights it gave us on ways to improve the tool.
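    To make the "focus on the verbs instead of the nouns" point concrete, here is a small illustrative sketch (not from the article) showing the same computation written first with mutable state and then as a functional pipeline, using the LINQ flavour of that style in C#:

        using System;
        using System.Linq;

        class VerbsVersusNouns
        {
            static void Main()
            {
                var orders = new[] { 12.50m, 8.00m, 101.99m, 45.00m };

                // Noun-focused, stateful version: an accumulator mutated inside a loop.
                decimal totalLarge = 0m;
                foreach (decimal order in orders)
                {
                    if (order > 10m)
                    {
                        totalLarge += order;
                    }
                }

                // Verb-focused version: the same result expressed as a pipeline of
                // operations (filter, then sum) with no mutable state to protect.
                decimal totalLargeFunctional = orders.Where(o => o > 10m).Sum();

                Console.WriteLine("{0} == {1}", totalLarge, totalLargeFunctional);
            }
        }

    The second form is also the one that parallelises mechanically (for example by inserting AsParallel() into the pipeline), which is exactly the property the article argues will matter as core counts climb.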

    Read the article

  • Ubuntu 12.04 // Likewise Open // Unable to ever authenticate AD users

    - by Rob
    So Ubuntu 12.04, Likewise latest from the beyondtrust website. Joins the domain fine. Gets proper information from lw-get-status. Can use lw-find-user-by-name to retrieve/locate users. Can use lw-enum-users to get all users. Attempting to log in with an AD user via SSH generates the following errors in the auth.log file:

        Nov 28 19:15:45 hostname sshd[2745]: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:15:45 hostname sshd[2745]: PAM adding faulty module: pam_winbind.so
        Nov 28 19:15:51 hostname sshd[2745]: error: PAM: Authentication service cannot retrieve authentication info for DOMAIN\\user.name from remote.hostname
        Nov 28 19:16:06 hostname sshd[2745]: Connection closed by 10.1.1.84 [preauth]

    Attempting to log in via LightDM itself generates similar errors in the auth.log file:

        Nov 28 19:19:29 hostname lightdm: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:19:29 hostname lightdm: PAM adding faulty module: pam_winbind.so
        Nov 28 19:19:47 hostname lightdm: pam_succeed_if(lightdm:auth): requirement "user ingroup nopasswdlogin" not met by user "DOMAIN\user.name"
        Nov 28 19:19:52 hostname lightdm: [lsass-pam] [module:pam_lsass]pam_sm_authenticate error [login:DOMAIN\user.name][error code:40022]
        Nov 28 19:19:54 hostname lightdm: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:19:54 hostname lightdm: PAM adding faulty module: pam_winbind.so

    Attempting to log in via a console on the system itself generates slightly different errors:

        Nov 28 19:31:09 hostname login[997]: PAM unable to dlopen(pam_winbind.so): /lib/security/pam_winbind.so: cannot open shared object file: No such file or directory
        Nov 28 19:31:09 hostname login[997]: PAM adding faulty module: pam_winbind.so
        Nov 28 19:31:11 hostname login[997]: [lsass-pam] [module:pam_lsass]pam_sm_authenticate error [login:DOMAIN\user.name][error code:40022]
        Nov 28 19:31:14 hostname login[997]: FAILED LOGIN (1) on '/dev/tty2' FOR 'DOMAIN\user.name', Authentication service cannot retrieve authentication info
        Nov 28 19:31:31 hostname login[997]: FAILED LOGIN (2) on '/dev/tty2' FOR 'DOMAIN\user.name', Authentication service cannot retrieve authentication info

    I am baffled. The errors obviously are correct: the file /lib/security/pam_winbind.so does not exist. If it's a dependency/required, surely it should be part of the package? I've installed/reinstalled, I've used the downloaded package from the beyondtrust website, I've used the repository; nothing seems to work, every method of installing this application generates the same errors for me.

    UPDATE: Hrmm, I thought Likewise didn't use native winbind but its own modules. Installing winbind from apt-get uninstalls pbis-open (Likewise) and generates failures when installing if pbis-open is installed first. Uninstalled winbind, reinstalled pbis-open, same issue as above. The file pam_winbind.so does not exist in that location.

        Setting up pbis-open-legacy (7.0.1.918) ...
        Installing Packages was successful
        This computer is joined to DOMAIN.LOCAL
        New libraries and configurations have been installed for PAM and NSS.

    Clearly it thinks it has installed it, but it hasn't. It may be a legacy issue with the previous attempt to configure domain integration manually with winbind.
Does anyone have a working likewise-open installation and does the /etc/nsswitch.conf include references to winbind? Or do the /etc/pam.d/common-account or /etc/pam.d/common-password reference pam_winbind.so? I'm unsure if those entries are just legacy or setup by likewise. UPDATE 2 : Complete reinstall of OS fixed it and it worked seamlessly, like it was meant to and those 2 PAM files did NOT include entries for pam_winbind.so, so that was the underlying problem. Thanks for the assist.

    Read the article

  • Multidimensional Thinking–24 Hours of Pass: Celebrating Women in Technology

    - by smisner
    It’s Day 1 of #24HOP and it’s been great to participate in this event with so many women from all over the world in one long training-fest. The SQL community has been abuzz on Twitter with running commentary, which is fun to watch while listening to the current speaker. If you missed the fun today because you’re busy with all that work you’ve got to do – don’t despair. All sessions are recorded and will be available soon. Keep an eye on the 24 Hours of Pass page for details. And the fun’s not over today. Rather than run 24 hours consecutively, #24HOP is now broken down into 12 hours over two days, so check out the schedule to see if there’s a session that interests you and fits your day. I’m pleased to announce that my business colleague Erika Bakse (Blog | Twitter) will be presenting on Day 2 – her debut presentation for a PASS event. (And I’m also pleased to say she’s my daughter!)

    Multidimensional Thinking: The Presentation

    My contribution to this lineup of terrific speakers was Multidimensional Thinking. Here’s the abstract: “Whether you’re developing Analysis Services cubes or creating PowerPivot workbooks, you need to get into a multidimensional frame of mind to produce a model that best enables users to answer their business questions on their own. Many database professionals struggle initially with multidimensional models because the data modeling process is much different than the one they use to produce traditional, third normal form databases. In this session, I’ll introduce you to the terminology of multidimensional modeling and step through the process of translating business requirements into a viable model.” If you watched the presentation and want a copy of the slides, you can download a copy here. And you’re welcome to download the slides even if you didn’t watch the presentation, but they’ll make more sense if you did!

    Kimball All the Way

    There’s only so much I can cover in the time allotted, but I hope that I succeeded in my attempt to build a foundation that prepares you for starting out in business intelligence. One of my favorite resources that gets into much more detail about all kinds of scenarios (well beyond the basics!) is The Data Warehouse Toolkit (Second Edition) by Ralph Kimball. Anything from Kimball or the Kimball Group is worth reading. Kimball material might take reading and re-reading a few times before it makes sense. From my own experience, I found that I actually had to just build my first data warehouse using dimensional modeling on faith that I was going in the right direction, because it just didn’t click with me initially. I’ve had years of practice since then and I can say it does get easier with practice. The most important thing, in my opinion, is that you simply must prototype a lot and solicit user feedback, because ultimately the model needs to make sense to them. They will definitely make sure you get it right!

    Schema Generation

    One question came up after the presentation about whether we use SQL Server Management Studio or Business Intelligence Development Studio (BIDS) to build the tables for the dimensional model. My answer? It really doesn’t matter how you create the tables. Use whatever method you’re comfortable with. But it just so happens that it IS possible to set up your design in BIDS as part of an Analysis Services project and to have BIDS generate the relational schema for you. I did a Webcast last year called Building a Data Mart with Integration Services that demonstrated how to do this.
Yes, the subject was Integration Services, but as part of that presentation, I showed how to leverage Analysis Services to build the tables, and then I showed how to use Integration Services to load those tables. I blogged about this presentation in September 2010 and included downloads of the project that I used. In the blog post, I explained that I missed a step in the demonstration. Oops. Just as an FYI, there were two more Webcasts to finish the story begun with the data – Accelerating Answers with Analysis Services and Delivering Information with Reporting Services. If you want to just cut to the chase and learn how to use Analysis Services to build the tables, you can see the Using the Schema Generation Wizard topic in Books Online.
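    To put the "use whatever method you're comfortable with" advice in concrete terms, here is a small illustrative sketch (my own, not from the presentation) that creates a minimal star schema, one dimension table and one fact table, with plain DDL run from C#. The table and column names are made up, and the connection string is a placeholder.

        using System.Data.SqlClient;

        class CreateStarSchema
        {
            static void Main()
            {
                // Placeholder connection string: point it at your own data mart database.
                const string connectionString = @"Server=.;Database=DataMart;Integrated Security=true";

                const string ddl = @"
                    CREATE TABLE dbo.DimProduct (
                        ProductKey   INT IDENTITY(1,1) PRIMARY KEY,  -- surrogate key
                        ProductName  NVARCHAR(100) NOT NULL,
                        Category     NVARCHAR(50)  NOT NULL
                    );

                    CREATE TABLE dbo.FactSales (
                        DateKey      INT   NOT NULL,                 -- e.g. 20110511
                        ProductKey   INT   NOT NULL REFERENCES dbo.DimProduct (ProductKey),
                        SalesAmount  MONEY NOT NULL,
                        Quantity     INT   NOT NULL
                    );";

                using (var connection = new SqlConnection(connectionString))
                using (var command = new SqlCommand(ddl, connection))
                {
                    connection.Open();
                    command.ExecuteNonQuery();
                }
            }
        }

    Whether those tables come from SSMS, the BIDS Schema Generation Wizard, or a script like this, the dimensional model is the same; the wizard simply saves you the typing once the design is done.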

    Read the article

  • Thread safe double buffering

    - by kdavis8
    I am trying to implement a draw map method that will draw the tiled image across the surface of the component. I'm having an issue with this code. The double buffering does not seem to be working, because the sprite flickers like crazy; my source code:

        package myPackage;

        import java.awt.Color;
        import java.awt.Graphics;
        import java.awt.Graphics2D;
        import java.awt.Image;
        import java.awt.Toolkit;
        import java.awt.image.BufferStrategy;
        import java.awt.image.BufferedImage;
        import javax.swing.JFrame;

        public class GameView extends JFrame implements Runnable {

            public BufferedImage backbuffer;
            public Graphics2D g2d;
            public Image img;
            Thread gameloop;
            Scene scene;

            public GameView() {
                super("Game View");
                setSize(600, 600);
                setVisible(true);
                setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);

                backbuffer = new BufferedImage(getWidth(), getHeight(), BufferedImage.TYPE_INT_RGB);
                g2d = backbuffer.createGraphics();

                Toolkit tk = Toolkit.getDefaultToolkit();
                img = tk.getImage(this.getClass().getResource("cage.png"));

                scene = new Scene(g2d, this);

                gameloop = new Thread(this);
                gameloop.start();
            }

            public static void main(String args[]) {
                new GameView();
            }

            public void paint(Graphics g) {
                g.drawImage(backbuffer, 0, 0, this);
                repaint();
            }

            @Override
            public void run() {
                // TODO Auto-generated method stub
                Thread t = Thread.currentThread();
                while (t == gameloop) {
                    scene.getScene("dirtmap");
                    g2d.drawImage(img, 80, 80, this);
                }
            }

            private void drawScene(String string) {
                // TODO Auto-generated method stub
                // g2d.setColor(Color.white);
                // g2d.fillRect(0, 0, getWidth(), getHeight());
                scene.getScene(string);
            }
        }

        package myPackage;

        import java.awt.Color;
        import java.awt.Component;
        import java.awt.Graphics;
        import java.awt.Graphics2D;
        import java.awt.Image;
        import java.awt.Toolkit;

        public class Scene {

            Graphics g2d;
            Component c;
            boolean loaded = false;

            public Scene(Graphics2D gr, Component co) {
                g2d = gr;
                c = co;
            }

            public void getScene(String mapName) {
                Toolkit tk = Toolkit.getDefaultToolkit();
                Image tile = tk.getImage(this.getClass().getResource("dirt.png"));
                // g2d.setColor(Color.red);
                for (int y = 0; y <= 18; y++) {
                    for (int x = 0; x <= 18; x += 1) {
                        g2d.drawImage(tile, x * 32, y * 32, c);
                    }
                }
                loaded = true;
            }
        }

    Read the article

  • Bash completion doesn't work, or is ignoring what I've typed; but works for commands

    - by Neil Traft
    Bash completion seems to be ignoring what I've typed (it tries to complete, but acts as if there's nothing under the cursor). I know I saw it work on this machine earlier today, but I'm not sure what has changed. Some examples:

    cd shows all directories under my current folder:

        $ cd co<tab><tab>
        cmake/ config/ doc/ examples/ include/ programs/ sandbox/ src/ .svn/ tests/

    Commands like ls and less show all files and directories under my current folder:

        $ ls co<tab><tab>
        cmake/ config/ .cproject Doxyfile.in include/ programs/ README.txt src/ tests/
        CMakeLists.txt COPYING.txt doc/ examples/ mainpage.dox .project sandbox/ .svn/

    Even when I try to complete things from a different folder, it gives me only the results for my current folder (telling me that it is completely ignoring what I've typed):

        $ cd ~/D<tab><tab>
        cmake/ config/ doc/ examples/ include/ programs/ sandbox/ src/ .svn/ tests/

    But it seems to be working fine for commands and variables:

        $ if<tab><tab>
        if ifconfig ifdown ifnames ifquery ifup
        $ echo $P<tab><tab>
        $PATH $PIPESTATUS $PPID $PS1 $PS2 $PS4 $PWD $PYTHONPATH

    I do have this bit in my .bashrc, and I have confirmed that my .bashrc is indeed getting sourced:

        if [ -f /etc/bash_completion ] && ! shopt -oq posix; then
            . /etc/bash_completion
        fi

    I've even tried manually executing that file, but it doesn't fix the problem:

        $ . /etc/bash_completion

    There was even one point in time where it was working for ls, but was not working for cd ... but I can't replicate that result now.

    Update: I also just discovered that I have terminals open from earlier that still work. I ran source .bashrc in one of them and afterwards completion was broken. Here is my .bashrc:

        # ~/.bashrc: executed by bash(1) for non-login shells.
        # see /usr/share/doc/bash/examples/startup-files (in the package bash-doc)
        # for examples
        #
        # Modified by Neil Traft
        #source ~/.profile

        # Allow globs to expand hidden files
        shopt -s dotglob nullglob

        # If not running interactively, don't do anything
        [ -z "$PS1" ] && return

        # don't put duplicate lines or lines starting with space in the history.
        # See bash(1) for more options
        HISTCONTROL=ignoreboth

        # append to the history file, don't overwrite it
        shopt -s histappend

        # for setting history length see HISTSIZE and HISTFILESIZE in bash(1)
        HISTSIZE=1000
        HISTFILESIZE=2000

        # check the window size after each command and, if necessary,
        # update the values of LINES and COLUMNS.
        shopt -s checkwinsize

        # If set, the pattern "**" used in a pathname expansion context will
        # match all files and zero or more directories and subdirectories.
        #shopt -s globstar

        # make less more friendly for non-text input files, see lesspipe(1)
        [ -x /usr/bin/lesspipe ] && eval "$(SHELL=/bin/sh lesspipe)"

        # set variable identifying the chroot you work in (used in the prompt below)
        if [ -z "$debian_chroot" ] && [ -r /etc/debian_chroot ]; then
            debian_chroot=$(cat /etc/debian_chroot)
        fi

        # Color the prompt
        export PS1="\[$(tput setaf 2)\]\u@\h:\[$(tput setaf 5)\]\W\[$(tput setaf 2)\] $\[$(tput sgr0)\] "

        # enable color support of ls and also add handy aliases
        if [ -x /usr/bin/dircolors ]; then
            test -r ~/.dircolors && eval "$(dircolors -b ~/.dircolors)" || eval "$(dircolors -b)"
            alias ls='ls --color=auto'
            #alias dir='dir --color=auto'
            #alias vdir='vdir --color=auto'
            alias grep='grep --color=auto'
            alias fgrep='fgrep --color=auto'
            alias egrep='egrep --color=auto'
        fi

        # Add an "alert" alias for long running commands. Use like so:
        #   sleep 10; alert
        alias alert='notify-send --urgency=low -i "$([ $? = 0 ] && echo terminal || echo error)" "$(history|tail -n1|sed -e '\''s/^\s*[0-9]\+\s*//;s/[;&|]\s*alert$//'\'')"'

        # Alias definitions.
        # You may want to put all your additions into a separate file like
        # ~/.bash_aliases, instead of adding them here directly.
        # See /usr/share/doc/bash-doc/examples in the bash-doc package.
        if [ -f ~/.bash_aliases ]; then
            . ~/.bash_aliases
        fi

        # enable programmable completion features (you don't need to enable
        # this, if it's already enabled in /etc/bash.bashrc and /etc/profile
        # sources /etc/bash.bashrc).
        if [ -f /etc/bash_completion ] && ! shopt -oq posix; then
            . /etc/bash_completion
        fi

    Read the article

  • Why you need to tag your build servers in TFS

    - by Martin Hinshelwood
    At SSW we use gated check-in for all of our projects. The benefits are based on the number of developers you have working on your project. Let's say you have 30 developers and each developer breaks the build once per month. That could mean that you have a broken build every day! Gated check-ins help, but they have a down side that manifests as queued builds and moaning developers. The way to combat this is to have more build servers, but with that comes complexity. Inevitably you will need to install components that you would expect to be installed on target computers, but how do you keep track of which build servers have which bits? What about a geographically diverse team? If you have a centrally controlled infrastructure you might have build servers in multiple regions, and you don’t want teams in Sydney copying files from Beijing and vice versa on a regular basis.

    So, what is the answer? It's tags. You can add a set of tags to your agents and then set which tags to look for in the build definition.

    Figure: Open up your Build Controller Manager

    Select “Build | Manage Build Controllers…” to get a list of all of your controllers and the build agents that are associated with them.

    Figure: The list of build agents and their controllers

    Each of these agents might be subtly different. For example, only one of these agents has FTP software installed. This software is required for only one of the many builds we have set up. My ethos for build servers is to keep them as clean as possible and not to install anything that is not absolutely necessary. For me that means anything that does not add a *.target file is suspect, and should really be under version control and called via the command line from there. So, some of the things you may install are: Silverlight 4 SDK, Visual Studio 2010, Visual Studio 2008, WIX, etc. You should not install things that will not end up on the target users' computers. For a website that means something different to a client than to a server, but I am sure you get the idea. One thing you can do to make things easier is to create a tag for each of the things that you install. That way developers can find the things they need. We may change to using a more generic tagging structure (like “Web Application” or “WinForms Application”) if this gets too unwieldy, but for now the list of tags is limited.

    Figure: Tags associated with one of our build agents

    Once you have your build agents all tagged up, ALL your builds will start to fail. This is because the default setting for a build is to look for an agent that exactly matches the tags for the build, and we have not added any yet. The quick way to fix this is to change the “Tag Comparison Operator” from “ExactMatch” to “MatchAtLeast” to get your build immediately working.

    Figure: Tag Comparison Operator changes to MatchAtLeast to get builds to run.

    The next thing to do is look for specific tags. You just select from the list of available tags and the controller will make sure you get to a build agent that has them.

    Figure: I want Silverlight, VS2010 and WIX, but do not care about Location.

    And there you go: you can now have build agents for different purposes and regions within the same environment. You can also use name filtering, so if you have a good agent naming convention you can filter by that for regions. For example, your agents might be “SYDVMAPTFSBP01” and “SYDVMAPTFSBP02”, so a name filter of “SYD*” would target all of the Sydney build agents.

    Figure: Agent names can be used for filtering as well

    This flexibility will allow you to build better software by reducing the likelihood of landing on a build agent that is missing a required dependency.

    Figure: Setting the name filter based on server location

    Used in combination, there is a lot of power here to coordinate tens of build servers for multiple projects across multiple regions so your developers get the most out of your environment. Technorati Tags: ALM, TFBS, TFS 2010, TFS Admin
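    For larger farms, the same tags can be applied programmatically rather than through the dialog. The sketch below is an assumption-laden example against the TFS 2010 object model; treat the member names (QueryBuildControllers, IBuildAgent.Tags, Save) as things to verify against the Microsoft.TeamFoundation.Build.Client assembly before relying on them.

        using System;
        using Microsoft.TeamFoundation.Build.Client;
        using Microsoft.TeamFoundation.Client;

        class TagBuildAgents
        {
            static void Main()
            {
                // Assumed API shapes from the TFS 2010 object model; the collection URL
                // is a placeholder for your own server.
                var collection = new TfsTeamProjectCollection(new Uri("http://tfs:8080/tfs/DefaultCollection"));
                var buildServer = collection.GetService<IBuildServer>();

                foreach (IBuildController controller in buildServer.QueryBuildControllers(true))
                {
                    foreach (IBuildAgent agent in controller.Agents)
                    {
                        // Tag Sydney agents by naming convention, mirroring the SYD* name filter above.
                        if (agent.Name.StartsWith("SYD", StringComparison.OrdinalIgnoreCase))
                        {
                            agent.Tags.Add("Sydney");
                            agent.Save();
                        }
                    }
                }
            }
        }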

    Read the article

  • Silverlight Cream for May 11, 2010 -- #859

    - by Dave Campbell
    In this All Submittal Issue: Colin Eberhardt, Ken Johnson, Alan Beasley, Pencho Popadiyn, Phil Middlemiss, Khawar (x2), Levente Mihály, Alex van Beek, Bart Czernicki, Michael Washington, and Mark Monster.

    Shoutout: Not Silverlight necessarily, but definitely VS2010 – read what Brett Balmer has to say In Defense of Portrait Mode.

    From SilverlightCream.com:

    Silverlight MultiBinding solution for Silverlight 4
    Colin Eberhardt updated his Silverlight MultiBinding solution to Silverlight 4. Great article with explanatory graphics, and links to the code... congrats on the use in the FaceBook Client too!

    Spirograph Shapes: WPF Bezier shapes from math formulae
    Wow... I haven't seen this much math since my Master's Thesis! Check out all the shapes Ken Johnson has built... don't let the math scare you... just use it :)

    Busy Dizzy Bee-sley Spirographic Animation in Expression Blend and Silverlight
    This is just fun... I saw Michael Washington playing with this yesterday at the Arizona Day of .NET but didn't have a chance to ask what it was... Alan Beasley had a good time building this, and is sharing a very detailed tutorial with us.

    ModalDialogs, IEditableObject and MVVM in Silverlight 4
    Pencho Popadiyn said the 'M' word over at SilverlightShow... actually the 'MVVM' word :) ... he's discussing modal dialogs with no code in the View... check out how he did it.

    A Chrome and Glass Theme - Part 6
    Phil Middlemiss is up to episode 6 in his theme-building tutorial... this time out, he's giving the TabControl and TabItem new clothes... specifically discussing what to change and what to allow to inherit... good stuff!

    Silverlight 4 Fonts gotcha
    Check out Khawar's ATM Machine demo -- there's a link on the page for this post... he had an issue with fonts, ratted it out, and explains it for all of us... thanks Khawar.

    Demystifying Silverlight Obfuscation
    Khawar also has a good post up on obfuscating your Silverlight... definitely showing that it's not all that difficult to do.

    geoGallery, a WinPhone7 sample
    OK, this is interesting... using the geolocation feature of WP7, Levente Mihály hits Google Picasa to find pictures... good write-up and all the code.

    Silverlight 4: Digitally signing a XAP with Visual Studio 2010
    Alex van Beek has a nice tutorial on signing your XAP file using Visual Studio 2010... of course you may want to visit Tim Heuer's blog (search at SC) to find the two good deals on certificates that are still in play.

    Creating Key Performance Indicators (KPIs) in Expression Blend 4 for Business Intelligence applications
    In an interesting post, Bart Czernicki describes using the shape assets in Blend 4 to produce a KPI display in Silverlight or WPF. A discussion of the shapes' evolution for KPIs is included, as well as some alternate shape uses.

    A DotNetNuke Silverlight 4 Drag and Drop File Manager
    Michael Washington has blogged about his Drag and Drop File Manager using the View Model Style pattern. This is covered in two CodeProject articles listed in the post. The design work was done by Alan Beasley, and links to his work are there as well as covered in other SC posts.

    How to select a ListItem on Hover
    Mark Monster had a use case for selecting a ListBox entry by hovering... but he did it with a Behavior, and for a ListBox and PathListBox, and it works with DataBinding... Stay in the 'Light!
Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Windows Azure Emulators On Your Desktop

    - by BuckWoody
    Many people feel they have to set up a full Azure subscription online to try out and develop on Windows Azure. But you don’t have to do that right away. In fact, you can download the Windows Azure Compute Emulator – a “cloud development environment” – right on your desktop. No, it’s not for production use, and no, you won’t have other people using your system as a cloud provider, and yes, there are some differences with production Windows Azure, but you’ll be able to code, run, test, diagnose, watch, change and configure code without having any connection to the Internet at all. The best thing about this approach is that when you are ready to deploy the code you’ve been testing, a few clicks deploy it to your subscription once you create one.

    So what deep magic does it take to run such a thing right on your laptop or even a Virtual PC? Well, it’s actually not all that difficult. You simply download and install the Windows Azure SDK (you can even get a free version of Visual Studio for it to run on – you’re welcome) from here: http://msdn.microsoft.com/en-us/windowsazure/cc974146.aspx

    This SDK will also install the Windows Azure Compute Emulator and the Windows Azure Storage Emulator – and then you’re all set. Right-click the icon for Visual Studio and select “Run as Administrator”. Now open a new “Cloud” type of project and add the Web and Worker Roles that you want to code. When you’re done with your design, press F5 to start the desktop version of Azure.

    Want to learn more about what’s happening underneath? Right-click the tray icon with the Azure logo and select the two emulators to see what they are doing.

    In the configuration files, you’ll see a “Use Development Storage” setting. You can call the BLOB, Table or Queue storage and it will all run on your desktop. When you’re ready to deploy everything to Windows Azure, you simply change the configuration settings and add the storage keys and so on that you need.

    Want to learn more about all this?

    Overview of the Windows Azure Compute Emulator: http://msdn.microsoft.com/en-us/library/gg432968.aspx
    Overview of the Windows Azure Storage Emulator: http://msdn.microsoft.com/en-us/library/gg432983.aspx
    January 2011 Training Kit: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=413E88F8-5966-4A83-B309-53B7B77EDF78&displaylang=en
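    To convince yourself the storage emulator is really doing the work, a few lines against development storage are enough. A minimal sketch, assuming the SDK-era Microsoft.WindowsAzure.StorageClient library of that period (where the container method is CreateIfNotExist rather than the later CreateIfNotExists); the container and blob names are placeholders:

        using Microsoft.WindowsAzure;
        using Microsoft.WindowsAzure.StorageClient;

        class DevStorageDemo
        {
            static void Main()
            {
                // Shortcut account that targets the local storage emulator
                // (equivalent to the "UseDevelopmentStorage=true" connection string).
                CloudStorageAccount account = CloudStorageAccount.DevelopmentStorageAccount;

                CloudBlobClient blobClient = account.CreateCloudBlobClient();
                CloudBlobContainer container = blobClient.GetContainerReference("scratch");
                container.CreateIfNotExist();

                // Write and read back a blob entirely against the emulator.
                CloudBlob blob = container.GetBlobReference("hello.txt");
                blob.UploadText("Hello from the compute and storage emulators");
                System.Console.WriteLine(blob.DownloadText());
            }
        }

    When you later move to a real subscription, only the account changes: swap DevelopmentStorageAccount for CloudStorageAccount.Parse on your real connection string and the rest of the code is untouched.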

    Read the article

  • maintaining a growing, diverse codebase with continuous integration

    - by Nate
    I am in need of some help with the philosophy and design of a continuous integration setup. Our current CI setup uses buildbot. When I started out designing it, I inherited (well, not strictly, as I was involved in its design a year earlier) a bespoke CI builder that was tailored to run the entire build at once, overnight. After a while, we decided that this was insufficient, and started exploring different CI frameworks, eventually choosing buildbot. One of my goals in transitioning to buildbot (besides getting to enjoy all the whiz-bang extras) was to overcome some of the inadequacies of our bespoke nightly builder.

    Humor me for a moment, and let me explain what I have inherited. The codebase for my company is almost 150 unique C++ Windows applications, each of which has dependencies on one or more of a dozen internal libraries (and many on 3rd-party libraries as well). Some of these libraries are interdependent, and have depending applications that (while they have nothing to do with each other) have to be built with the same build of that library. Half of these applications and libraries are considered "legacy" and unportable, and must be built with several distinct configurations of the IBM compiler (for which I have written unique subclasses of Compile), and the other half are built with Visual Studio. The code for each compiler is stored in two separate Visual SourceSafe repositories (which I am simply handling using a bunch of ShellCommands, as there is no support for VSS).

    Our original nightly builder simply took down the source for everything and built stuff in a certain order. There was no way to build only a single application, or pick a revision, or to group things. It would launch virtual machines to build a number of the applications. It wasn't very robust, it wasn't distributable, and it wasn't terribly extensible. I wanted to be able to overcome all of these limitations in buildbot. The way I did this originally was to create entries for each of the applications we wanted to build (all 150ish of them), then create triggered schedulers that could build various applications as groups, and then subsume those groups under an overall nightly build scheduler. These could run on dedicated slaves (no more virtual machine chicanery), and if I wanted I could simply add new slaves. Now, if we want to do a full build out of schedule, it's one click, but we can also build just one application should we so desire.

    There are four weaknesses of this approach, however. One is our source tree's complex web of dependencies. In order to simplify config maintenance, all builders are generated from a large dictionary. The dependencies are retrieved and built in a not-terribly-robust fashion (namely, keying off of certain things in my build-target dictionary). The second is that each build has between 15 and 21 build steps, which is hard to browse and look at in the web interface, and since there are around 150 columns, takes forever to load (think from 30 seconds to multiple minutes). Thirdly, we no longer have autodiscovery of build targets (although, as much as one of my coworkers harps on me about this, I don't see what it got us in the first place). Finally, the aforementioned coworker likes to constantly bring up the fact that we can no longer perform a full build on our local machine (though I never saw what that got us, either, considering that it took three times as long as the distributed build; I think he is just paranoid about ever breaking the build).
Now, moving to new development, we are starting to use g++ and subversion (not porting the old repository, mind you - just for the new stuff). Also, we are starting to do more unit testing ("more" might give the wrong picture... it's more like any), and integration testing (using python). I'm having a hard time figuring out how to fit these into my existing configuration. So, where have I gone wrong philosophically here? How can I best proceed forward (with buildbot - it's the only piece of the puzzle I have license to work on) so that my configuration is actually maintainable? How do I address some of my design's weaknesses? What really works in terms of CI strategies for large, (possibly over-)complex codebases?

    Read the article

< Previous Page | 533 534 535 536 537 538 539 540 541 542 543 544  | Next Page >