Search Results



  • WP7 “Phantom Data” Source Possibly Revealed?

    - by Bil Simser
    Recently there’s been rumours floating around regarding “phantom” Windows Phone 7 data being magically sent and received on the latest WP7 phones. The news has mostly been floating around Twitter so I didn’t pay it much attention. Then the BBC Technology News picked it up, so I thought I would look more into it myself, seeing that we have WP7 phones and maybe there was some truth to all this (and, more importantly, what the cause was).

    Full disclosure: I don’t have a lot of data points around this. This is from looking at a few phone logs, changing the configuration, and looking back again after the change. I haven’t done a clean baseline test, nor have I done testing with hundreds of phones. I leave the experience up to the reader to decide.

    So I went spelunking into the phone logs to see what was up. Most providers will show you data usage, at least on a daily basis. I lucked out with my provider and plan in that they provide hourly breakdowns. Here’s a snapshot from my usage throughout one night.

    Timestamp      Data Usage
    12:38:30 AM    2098 Kilobytes
    1:30:30 AM     2 Kilobytes
    2:38:30 AM     7118 Kilobytes
    3:38:30 AM     6622 Kilobytes
    4:38:30 AM     76 Kilobytes
    5:38:30 AM     29 Kilobytes
    6:38:30 AM     19 Kilobytes
    7:38:30 AM     20 Kilobytes

    A few observations from this data:

    - Data seems to be collected on a regular basis. Looking at some other people’s phone logs, the times vary but it’s always hourly.
    - There’s not a tremendous amount of data here (about 16 megabytes), but it seems like a lot for 7 hours.
    - The phone was connected to my home WiFi during this period.
    - Nothing was running and the phone was in a locked state.

    Like I said, not a lot of data, but it adds up. 16MB for 7 hours works out to about 50MB in a 24 hour period. That’s just plain old data being collected (somewhere, somehow) and not actual usage (Marketplace, email, browsing, etc.). Besides, when connected to a WiFi network you shouldn’t be charged data usage by your phone company (in theory, YMMV).

    After reviewing the logs I formed a theory that the only thing that could possibly be sending data is the Feedback feature. With no other apps running under lock, what else could it be? In Windows Phone 7, the last option under Settings is Feedback. This sends feedback to Microsoft to “help improve Windows Phone”. On this page you have three options:

    - Send feedback and use my cellular data connection
    - Send feedback and (presumably) use my WiFi connection
    - Don’t send feedback

    Knowing what I know about Microsoft, they do use the feedback data. For example, some of the placement and inclusion of features in Office 2007 was based on the Feedback data that Office sends (assuming you had opted in). However, in the Privacy Statement (it’s long but a good read at least once in your life), the phone manual, and every other source I could look at, there is no information about how much data it’s planning to send, just that it’s sending some data and that “some data charges with your carrier may apply”.

    Looking back at the logs, I have to wonder. 7MB at 2:38 and *then* another 6.5MB the next hour. That’s a lot of information. And it adds up: 50MB in a 24 hour period x 30 days puts most people over a normal 1GB plan. And frankly, why am I paying for a data plan only to have 80% of it chewed up by Microsoft, with no real benefit to me? If they included porn in the 50MB daily transfer I’d be okay with this, but I don’t see any new movies on my phone.

    So I turned it off. Set Feedback to disabled and waited. And waited. And generally didn’t use the phone if I could.
    The next day I went back to look at the data usage logs from the time period after turning the feedback mechanism off. Here are the results.

    Timestamp      Data Usage
    1:19:48 PM     0 Kilobytes
    2:19:48 PM     0 Kilobytes
    3:19:48 PM     0 Kilobytes
    4:19:48 PM     678 Kilobytes (took a phone call)
    5:19:48 PM     82 Kilobytes
    6:19:48 PM     88 Kilobytes
    7:20:30 PM     86 Kilobytes (guess they changed their reporting time)
    8:20:30 PM     86 Kilobytes
    9:20:30 PM     66 Kilobytes
    10:20:30 PM    67 Kilobytes
    11:20:30 PM    49 Kilobytes
    12:20:30 AM    32 Kilobytes
    1:20:30 AM     38 Kilobytes
    2:20:31 AM     18 Kilobytes
    3:20:31 AM     27 Kilobytes
    4:20:31 AM     86 Kilobytes
    5:20:31 AM     53 Kilobytes
    6:20:31 AM     22 Kilobytes
    7:22:15 AM     30 Kilobytes (another reporting time change)
    8:22:15 AM     29 Kilobytes
    9:22:15 AM     74 Kilobytes
    10:22:15 AM    154 Kilobytes (phone call)
    11:22:15 AM    12 Kilobytes
    12:13:27 PM    49 Kilobytes
    1:13:27 PM     197 Kilobytes (phone call)

    Quite a *drastic* change from when Feedback was turned on. For a 24 hour period (sans 3 phone calls) I consumed about 1MB. Still quite a bit of transfer going on, but at least it amounts to 30MB per month, not 30MB per day! Like I said, this observation is neither scientific nor conclusive. You decide what to do, but frankly, until Microsoft makes this data transfer exempt from your data plan (like that will happen) I would just turn Feedback off. YMMV.
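    To make the back-of-the-envelope arithmetic above concrete, here is a small illustrative C# sketch (not from the original post; the sample values are simply the “Feedback on” numbers from the first table) that extrapolates the hourly readings to daily and monthly totals:

```csharp
using System;
using System.Linq;

class DataUsageEstimate
{
    static void Main()
    {
        // Hourly usage samples (in kilobytes) from the "Feedback enabled" log above.
        int[] hourlyKb = { 2098, 2, 7118, 6622, 76, 29, 19, 20 };

        double totalMb    = hourlyKb.Sum() / 1024.0;         // ~15.6 MB across the readings
        double perDayMb   = totalMb / hourlyKb.Length * 24;  // extrapolated to 24 hours
        double perMonthGb = perDayMb * 30 / 1024.0;          // extrapolated to 30 days

        Console.WriteLine("Observed: {0:F1} MB over {1} readings", totalMb, hourlyKb.Length);
        Console.WriteLine("Extrapolated: {0:F0} MB/day, {1:F1} GB/month", perDayMb, perMonthGb);
    }
}
```

    This lines up roughly with the 50MB/day and “over a normal 1GB plan” figures quoted above.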


  • MySQL Connector/Net 6.4.6 Maintenance Release has been released

    - by fernando
    MySQL Connector/Net 6.4.6, a new version of the all-managed .NET driver for MySQL, has been released. This is a maintenance release and is recommended for use in production environments. It is appropriate for use with MySQL server versions 5.0-5.6. This is intended to be the final release for Connector/NET 6.4. It is now available in source and binary form from http://dev.mysql.com/downloads/connector/net/#downloads and mirror sites (note that not all mirror sites may be up to date at this point; if you can't find this version on some mirror, please try again later or choose another download site).

    The 6.4.6 version of MySQL Connector/Net brings the following fixes:

    - Fix for List.Contains generating a bunch of ORs instead of the more efficient IN clause in LINQ to Entities (Oracle bug #14016344, MySql bug #64934).
    - Fix for error when trying to change the name of an Index on the Indexes/Keys editor; along with this fix, users can now change the Index type of a new Index, which could not be done in previous versions, and when changing the Index name the change is reflected on the list view at the left side of the Index/Keys editor (Oracle bug #13613801).
    - Fix for stored procedure call using only its name with EF code first (MySql bug #64999, Oracle bug #14008699).
    - Fix for performance issue in generated EF query: .NET StartsWith/Contains/EndsWith produces MySql's locate instead of Like (MySql bug #64935, Oracle bug #14009363).
    - Fix for script generated for code first containing wrong alter table and wrong declaration for byte[] (MySql bug #64216, Oracle bug #13900091).
    - Fix for Exception thrown when using cascade delete in an EDM Model-First in Entity Framework (Oracle bug #14008752, MySql bug #64779).
    - Fix for Session locking issue with MySqlSessionStateStore (MySql bug #63997, Oracle bug #13733054).
    - Fixed deleting a user profile using the Profile provider (MySQL bug #64409, Oracle bug #13790123).
    - Fix for bug "Cannot Create an Entity with a Key of Type String" (MySQL bug #65289, Oracle bug #14540202). This fix checks if the type has a FixedLength facet set in order to create a char; otherwise it creates varchar, mediumtext or longtext types when using a String CLR type in Code First or Model First, also tested in Database First. Unit tests added for Code First and ProviderManifest.
    - Fix for bug "CacheServerProperties can cause 'Packet too large' error" (MySQL Bug #66578, Orabug #14593547).
    - Fix for handling unnamed parameters in MySqlCommand. This fix allows MySqlCommand to handle parameters without requiring naming (e.g. INSERT INTO Test (id,name) VALUES (?, ?)) (MySQL Bug #66060, Oracle bug #14499549).
    - Fixed inheritance in Entity Framework Code First scenarios. The Discriminator column is created using its correct type, varchar(128) (MySql bug #63920 and Oracle bug #13582335).
    - Fixed "Trying to customize column precision in Code First does not work" (MySql bug #65001, Oracle bug #14469048).
    - Fixed bug "ASP.NET Membership database fails on MySql database UTF32" (MySQL bug #65144, Oracle bug #14495292).
    - Fix for MySqlCommand.LastInsertedId holding only 32 bit values (MySql bug #65452, Oracle bug #14171960) by changing several internal declarations of lastinsertid from int to long.
    - Fixed "Decimal type should have digits at right of decimal point"; the default is now 2, but the user's changes in the EDM designer are recognized (MySql bug #65127, Oracle bug #14474342).
    - Fix for NullReferenceException when saving an uninitialized row in Entity Framework (MySql bug #66066, Oracle bug #14479715).
    - Fix for error when calling RoleProvider.RemoveUserFromRole(), which caused an exception due to a wrong table being used (MySql bug #65805, Oracle bug #14405338).
    - Fix for "Memory Leak on MySql.Data.MySqlClient.MySqlCommand": too many MemoryStream instances created (MySql bug #65696, Oracle bug #14468204).
    - Small improvement on MySqlPoolManager CleanIdleConnections for a better MySqlPoolManager idle cleanup timer at startup (MySql bug #66472 and Oracle bug #14652624).
    - Fix for bug "TIMESTAMP values are mistakenly represented as DateTime with Kind = Local" (Mysql bug #66964, Oracle bug #14740705).
    - Fix for bug "Keyword not supported. Parameter name: AttachDbFilename" (Mysql bug #66880, Oracle bug #14733472).
    - Added support to the MySql script file to retrieve data when using "SHOW" statements.
    - Fix for Package Load Failure in Visual Studio 2005 (MySql bug #63073, Oracle bug #13491674).
    - Fix for bug "Unable to connect using IPv6 connections" (MySQL bug #67253, Oracle bug #14835718).
    - Added auto-generated values for Guid identity columns (MySql bug #67450, Oracle bug #15834176).
    - Fix for method FirstOrDefault not supported in some LINQ to Entities queries (MySql bug #67377, Oracle bug #15856964).

    The release is available to download at http://dev.mysql.com/downloads/connector/net/6.4.html

    Documentation
    -------------------------------------
    You can view current Connector/Net documentation at http://dev.mysql.com/doc/refman/5.5/en/connector-net.html
    You can find our team blog at http://blogs.oracle.com/MySQLOnWindows.
    You can also post questions on our forums at http://forums.mysql.com/.

    Enjoy and thanks for the support!
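    As a hedged illustration of the first fix in the list above (the List.Contains translation), here is a minimal sketch of the kind of LINQ to Entities query it affects. The Product entity, ShopContext, and connection setup are hypothetical; the sketch assumes an Entity Framework Code First project with a configured MySQL connection string and the Connector/Net EF provider registered.

```csharp
using System.Data.Entity;   // EntityFramework, assumed to be referenced
using System.Linq;

// Hypothetical Code First model and context, used only for illustration.
public class Product
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public class ShopContext : DbContext
{
    public DbSet<Product> Products { get; set; }
}

class ContainsExample
{
    static void Main()
    {
        int[] ids = { 1, 2, 3 };
        using (var db = new ShopContext())
        {
            // With the 6.4.6 fix, a Contains over a list/array in LINQ to Entities
            // is expected to translate to a single "WHERE Id IN (1, 2, 3)" clause
            // rather than a chain of OR comparisons.
            var products = db.Products
                             .Where(p => ids.Contains(p.Id))
                             .ToList();
        }
    }
}
```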


  • Announcing ASP.NET MVC 3 (Release Candidate 2)

    - by ScottGu
    Earlier today the ASP.NET team shipped the final release candidate (RC2) for ASP.NET MVC 3.  You can download and install it here.

    Almost there…

    Today’s RC2 release is the near-final release of ASP.NET MVC 3, and is a true “release candidate” in that we are hoping to not make any more code changes with it.  We are publishing it today so that people can do final testing with it, let us know if they find any last minute “showstoppers”, and start updating their apps to use it.  We will officially ship the final ASP.NET MVC 3 “RTM” build in January.

    Works with both VS 2010 and VS 2010 SP1 Beta

    Today’s ASP.NET MVC 3 RC2 release works with both the shipping version of Visual Studio 2010 / Visual Web Developer 2010 Express and the newly released VS 2010 SP1 Beta.  This means that you do not need to install VS 2010 SP1 (or the SP1 beta) in order to use ASP.NET MVC 3.  It works just fine with the shipping Visual Studio 2010.  I’ll do a blog post next week, though, about some of the nice additional feature goodies that come with VS 2010 SP1 (including IIS Express and SQL CE support within VS) which make the dev experience for both ASP.NET Web Forms and ASP.NET MVC even better.

    Bugs and Perf Fixes

    Today’s ASP.NET MVC 3 RC2 build contains many bug fixes and performance optimizations.  Our latest performance tests indicate that ASP.NET MVC 3 is now faster than ASP.NET MVC 2, and that existing ASP.NET MVC applications will experience a slight performance increase when updated to run using ASP.NET MVC 3.

    Final Tweaks and Fit-N-Finish

    In addition to bug fixes and performance optimizations, today’s RC2 build contains a number of last-minute feature tweaks and “fit-n-finish” changes for the new ASP.NET MVC 3 features.  The feedback and suggestions we’ve received during the public previews have been invaluable in guiding these final tweaks, and we really appreciate people’s support in sending this feedback our way.  Below is a short list of some of the feature changes/tweaks made between last month’s ASP.NET MVC 3 RC release and today’s ASP.NET MVC 3 RC2 release.

    jQuery updates and addition of jQuery UI

    The default ASP.NET MVC 3 project templates have been updated to include jQuery 1.4.4 and jQuery Validation 1.7.  We are also excited to announce today that we are including jQuery UI within our default ASP.NET project templates going forward.  jQuery UI provides a powerful set of additional UI widgets and capabilities.  It will be added by default to your project’s \scripts folder when you create new ASP.NET MVC 3 projects.

    Improved View Scaffolding

    The T4 templates used for scaffolding views with the Add-View dialog now generate views that use Html.EditorFor instead of helpers such as Html.TextBoxFor. This change enables you to optionally annotate models with metadata (using data annotation attributes) to better customize the output of your UI at runtime. The Add View scaffolding also supports improved detection and usage of primary key information on models (including support for naming conventions like ID, ProductID, etc).  For example, the Add View dialog box uses this information to ensure that the primary key value is not scaffolded as an editable form field, and that links between views are auto-generated correctly with primary key information. The default Edit and Create templates also now include references to the jQuery scripts needed for client validation, and scaffolded form views now support client-side validation by default (no extra steps required). Client-side validation with ASP.NET MVC 3 is also done using an unobtrusive JavaScript approach – making pages fast and clean.
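    The post illustrated the scaffolding and metadata support with screenshots. As a rough, hypothetical sketch (this Product class and its properties are invented for illustration, not taken from the post), this is the kind of model whose data annotations the Html.EditorFor-based templates pick up:

```csharp
using System.ComponentModel.DataAnnotations;

// Hypothetical model: with scaffolded views now using Html.EditorFor, data
// annotations like these drive how each field is rendered and validated.
public class Product
{
    // Recognized as the primary key by the Add View naming conventions,
    // so it is not emitted as an editable form field.
    public int ProductID { get; set; }

    [Required]
    [StringLength(100)]
    public string Name { get; set; }

    // Rendered as a multi-line <textarea> by the default editor template.
    [DataType(DataType.MultilineText)]
    public string Description { get; set; }

    [Range(0, 10000)]
    public decimal UnitPrice { get; set; }
}
```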
    [ControllerSessionState] –> [SessionState]

    ASP.NET MVC 3 adds support for session-less controllers.  With the initial RC you used a [ControllerSessionState] attribute to specify this.  We shortened this in RC2 to just be [SessionState].  Note that in addition to turning off session state, you can also set it to be read-only (which is useful for webfarm scenarios where you are reading but not updating session state on a particular request).

    [SkipRequestValidation] –> [AllowHtml]

    ASP.NET MVC includes built-in support to protect against HTML and Cross-Site Script Injection Attacks, and will throw an error by default if someone tries to post HTML content as input.  Developers need to explicitly indicate that this is allowed (and that they’ve hopefully built their app to securely support it) in order to enable it. With ASP.NET MVC 3, we are also now supporting a new attribute that you can apply to properties of models/viewmodels to indicate that HTML input is enabled, which enables much more granular protection in a DRY way.  In last month’s RC release this attribute was named [SkipRequestValidation].  With RC2 we renamed it to [AllowHtml] to make it more intuitive.  Setting the [AllowHtml] attribute on a model/viewmodel property will cause ASP.NET MVC 3 to turn off HTML injection protection when model binding just that property.

    Html.Raw() helper method

    The new Razor view engine introduced with ASP.NET MVC 3 automatically HTML encodes output by default.  This helps provide an additional level of protection against HTML and Script injection attacks. With RC2 we are adding a Html.Raw() helper method that you can use to explicitly indicate that you do not want to HTML encode your output, and instead want to render the content “as-is”.

    ViewModel/View –> ViewBag

    ASP.NET MVC has (since V1) supported a ViewData[] dictionary within Controllers and Views that enables developers to pass information from a Controller to a View in a late-bound way.  This approach can be used instead of, or in combination with, a strongly-typed model class.  A common use case is one where a strongly typed Product model is passed to the view in addition to two late-bound variables via the ViewData[] dictionary. With ASP.NET MVC 3 we are introducing a new API that takes advantage of the dynamic type support within .NET 4 to set/retrieve these values.  It allows you to use standard “dot” notation to specify any number of additional variables to be passed, and does not require that you create a strongly-typed class to do so.  With earlier previews of ASP.NET MVC 3 we exposed this API using a dynamic property called “ViewModel” on the Controller base class, and with a dynamic property called “View” within view templates.  A lot of people found the fact that there were two different names confusing, and several also said that using the name ViewModel was confusing in this context – since often you create strongly-typed ViewModel classes in ASP.NET MVC, and they do not use this API.  With RC2 we are exposing a dynamic property that has the same name – ViewBag – within both Controllers and Views.  It is a dynamic collection that allows you to pass additional bits of data from your controller to your view template to help generate a response.  Below is an example of how we could use it to pass a time-stamp message as well as a list of all categories to our view template:
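    The original post showed the controller code as an image; here is a hedged reconstruction along the lines described (the Product/Category classes and the in-memory data are hypothetical stand-ins for whatever data access the app actually uses):

```csharp
using System;
using System.Collections.Generic;
using System.Web.Mvc;

public class ProductsController : Controller
{
    public ActionResult Index(int id)
    {
        Product product = FindProduct(id);

        // Late-bound extras passed alongside the strongly-typed model via the dynamic ViewBag.
        ViewBag.Message = "Product list generated at " + DateTime.Now;
        ViewBag.Categories = AllCategories();

        return View(product);
    }

    // Hypothetical in-memory data access, just to keep the sketch self-contained.
    private static Product FindProduct(int id)
    {
        return new Product { ProductID = id, Name = "Sample product", CategoryID = 1 };
    }

    private static List<Category> AllCategories()
    {
        return new List<Category>
        {
            new Category { CategoryID = 1, Name = "Beverages" },
            new Category { CategoryID = 2, Name = "Produce" }
        };
    }
}

public class Product
{
    public int ProductID { get; set; }
    public string Name { get; set; }
    public int CategoryID { get; set; }
}

public class Category
{
    public int CategoryID { get; set; }
    public string Name { get; set; }
}
```

    In the view template, those extras are read back with the same dot notation (e.g. @ViewBag.Message).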
    Our view template (which is strongly-typed to expect a Product class as its model) can then use the two extra bits of information we passed in our ViewBag to generate the response.  In particular, it can use the list of categories passed in the dynamic ViewBag collection to generate a dropdownlist of friendly category names to help set the CategoryID property of our Product object.  The Controller/View combination then generates the HTML response for the page.

    Output Caching Improvements

    ASP.NET MVC 3’s output caching system no longer requires you to specify a VaryByParam property when declaring an [OutputCache] attribute on a Controller action method.  MVC3 now automatically varies the output cached entries when you have explicit parameters on your action method – allowing you to cleanly enable output caching on actions using code like the sketch at the end of this section.  In addition to supporting full page output caching, ASP.NET MVC 3 also supports partial-page caching – which allows you to cache a region of output and re-use it across multiple requests or controllers.  The [OutputCache] behavior for partial-page caching was updated with RC2 so that sub-content cached entries are varied based on input parameters as opposed to the URL structure of the top-level request – which makes caching scenarios both easier and more powerful than the behavior in the previous RC.
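    A minimal illustrative sketch of the full-page output caching behavior described above (the controller, action, and cache duration are made up for the example):

```csharp
using System.Web.Mvc;

public class CatalogController : Controller
{
    // No VaryByParam needed in MVC 3 RC2: cached entries are automatically
    // varied by the action's explicit parameters (here, "id").
    [OutputCache(Duration = 60)]
    public ActionResult Details(int id)
    {
        ViewBag.Message = "Product " + id;
        return View();
    }
}
```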
    @model declaration does not add whitespace

    In earlier previews, the strongly-typed @model declaration at the top of a Razor view added a blank line to the rendered HTML output. This has been fixed so that the declaration does not introduce whitespace.

    Changed "Html.ValidationMessage" Method to Display the First Useful Error Message

    The behavior of the Html.ValidationMessage() helper was updated to show the first useful error message instead of simply displaying the first error. During model binding, the ModelState dictionary can be populated from multiple sources with error messages about the property, including from the model itself (if it implements IValidatableObject), from validation attributes applied to the property, and from exceptions thrown while the property is being accessed. When the Html.ValidationMessage() method displays a validation message, it now skips model-state entries that include an exception, because these are generally not intended for the end user. Instead, the method looks for the first validation message that is not associated with an exception and displays that message. If no such message is found, it defaults to a generic error message that is associated with the first exception.

    RemoteAttribute “Fields” -> “AdditionalFields”

    ASP.NET MVC 3 includes built-in remote validation support with its validation infrastructure.  This means that the client-side validation script library used by ASP.NET MVC 3 can automatically call back to controllers you expose on the server to determine whether an input element is indeed valid as the user is editing the form (allowing you to provide real-time validation updates). You can accomplish this by decorating a model/viewmodel property with a [Remote] attribute that specifies the controller/action that should be invoked to remotely validate it. With the RC this attribute had a “Fields” property that could be used to specify additional input elements that should be sent from the client to the server to help with the validation logic.  To improve the clarity of what this property does we have renamed it to “AdditionalFields” with today’s RC2 release.

    ViewResult.Model and ViewResult.ViewBag Properties

    The ViewResult class now exposes both a “Model” and “ViewBag” property off of it.  This makes it easier to unit test Controllers that return views, and saves you from having to access the Model via the ViewResult.ViewData.Model property.

    Installation Notes

    You can download and install the ASP.NET MVC 3 RC2 build here.  It can be installed on top of the previous ASP.NET MVC 3 RC release (it should just replace the bits as part of its setup). The one component that will not be updated by the above setup (if you already have it installed) is the NuGet Package Manager.  If you already have NuGet installed, please go to the Visual Studio Extensions Manager (via the Tools –> Extensions menu option) and click on the “Updates” tab.  You should see NuGet listed there – please click the “Update” button next to it to have VS update the extension to today’s release. If you do not have NuGet installed (and did not install the ASP.NET MVC RC build), then NuGet will be installed as part of your ASP.NET MVC 3 setup, and you do not need to take any additional steps to make it work.

    Summary

    We are really close to the final ASP.NET MVC 3 release, and will deliver the final “RTM” build of it next month.  It has been only a little over 7 months since ASP.NET MVC 2 shipped, and I’m pretty amazed by the huge number of new features, improvements, and refinements that the team has been able to add with this release (Razor, Unobtrusive JavaScript, NuGet, Dependency Injection, Output Caching, and a lot, lot more).  I’ll be doing a number of blog posts over the next few weeks talking about many of them in more depth.

    Hope this helps,

    Scott

    P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu


  • How to tunnel only specific hosts' routes through the OpenVPN client on Tomato

    - by kcome
    I am relatively newbie in networking world although I did coding and know some sysadmin background for a long time. and here I'm only one step from my destination. The whole picture is : at home I use one LinkSys E3000 as the gateway(don't know yet if this is it's name), wireless AP and no other routing/switching devices. It serves 1 PC and 1 Mac with LAN, 1 Mac Mini + 1 iPad + 2 smartphones with WIFI. My goal is use an openvpn client on the E3000 (with tomato firmware) and make my iPad and smartphone's all WiFi traffic through it, and other devices route remain the same non-openvpn route. So far I'm able to connect openvpn client on E3000 to an openvpn server, tunnel all my devices' all traffic through that openvpn connection. What's left is howto selectively route by source IP (at least in my guessing) to the tunnel while don't bother others. I had learned some 'iptables' and 'route' in past few days however without much luck, so here comes my question. Here are some info which will help you get the structure. ifconfig -a output, some useless lines striped, and in the web interface C0:C1:C0:1A:E0:28 is WAN, C0:C1:C0:1A:E0:27 is LAN, C0:C1:C0:1A:E0:29 is 2.4G wifi AP, C0:C1:C0:1A:E0:2A is 5G wifi AP. root@router:/tmp/home/root# ifconfig -a br0 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:27 inet addr:192.168.1.1 Bcast:192.168.1.255 Mask:255.255.255.0 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 eth0 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:27 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 eth1 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:29 UP BROADCAST RUNNING ALLMULTI MULTICAST MTU:1500 Metric:1 eth2 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:2A UP BROADCAST RUNNING ALLMULTI MULTICAST MTU:1500 Metric:1 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host ppp0 Link encap:Point-to-Point Protocol inet addr:172.200.1.43 P-t-P:172.200.0.1 Mask:255.255.255.255 UP POINTOPOINT RUNNING MULTICAST MTU:1480 Metric:1 vlan1 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:27 UP BROADCAST RUNNING ALLMULTI MULTICAST MTU:1500 Metric:1 vlan2 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:28 UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 wl0.1 Link encap:Ethernet HWaddr C0:C1:C0:1A:E0:29 BROADCAST MULTICAST MTU:1500 Metric:1 brctl show output root@router:/tmp/home/root# brctl show bridge name bridge id STP enabled interfaces br0 8000.c0c1c01ae027 no vlan1 eth1 eth2 before openvpn route-up script root@router:/tmp/home/root# route -n Kernel IP routing table Destination Gateway Genmask Flags Metric Ref Use Iface 172.200.0.1 0.0.0.0 255.255.255.255 UH 0 0 0 ppp0 192.168.1.0 0.0.0.0 255.255.255.0 U 0 0 0 br0 127.0.0.0 0.0.0.0 255.0.0.0 U 0 0 0 lo 0.0.0.0 172.200.0.1 0.0.0.0 UG 0 0 0 ppp0 openvpn server push PUSH: Received control message: 'PUSH_REPLY,redirect-gateway,dhcp-option DNS 8.8.8.8,route 172.20.0.1,topology net30,ping 10,ping-restart 120,ifconfig 172.20.0.6 172.20.0.5' openvpn's stock route-up script Apr 24 14:52:06 router daemon.notice openvpn[1768]: /sbin/ifconfig tun11 172.20.0.6 pointopoint 172.20.0.5 mtu 1500 Apr 24 14:52:08 router daemon.notice openvpn[1768]: /sbin/route add -net 72.14.177.29 netmask 255.255.255.255 gw 172.200.0.1 Apr 24 14:52:08 router daemon.notice openvpn[1768]: /sbin/route add -net 0.0.0.0 netmask 128.0.0.0 gw 172.20.0.5 Apr 24 14:52:08 router daemon.notice openvpn[1768]: /sbin/route add -net 128.0.0.0 netmask 128.0.0.0 gw 172.20.0.5 Apr 24 14:52:08 router daemon.notice openvpn[1768]: /sbin/route add -net 172.20.0.1 netmask 255.255.255.255 
gw 172.20.0.5 route after openvpn root@router:/tmp/home/root# route -n Kernel IP routing table Destination Gateway Genmask Flags Metric Ref Use Iface 172.20.0.5 0.0.0.0 255.255.255.255 UH 0 0 0 tun11 72.14.177.29 172.200.0.1 255.255.255.255 UGH 0 0 0 ppp0 172.200.0.1 0.0.0.0 255.255.255.255 UH 0 0 0 ppp0 172.20.0.1 172.20.0.5 255.255.255.255 UGH 0 0 0 tun11 192.168.1.0 0.0.0.0 255.255.255.0 U 0 0 0 br0 127.0.0.0 0.0.0.0 255.0.0.0 U 0 0 0 lo 0.0.0.0 172.20.0.5 128.0.0.0 UG 0 0 0 tun11 128.0.0.0 172.20.0.5 128.0.0.0 UG 0 0 0 tun11 0.0.0.0 172.200.0.1 0.0.0.0 UG 0 0 0 ppp0 something I had noticed and tried: * on the web interface of openvpn client there is an option "Create NAT on tunnel", if i check this, there is the following script (probably executed after openvpn connection established) root@router:/tmp/home/root# cat /tmp/etc/openvpn/fw/client1-fw.sh #!/bin/sh iptables -I INPUT -i tun11 -j ACCEPT iptables -I FORWARD -i tun11 -j ACCEPT iptables -t nat -I POSTROUTING -s 192.168.1.0/255.255.255.0 -o tun11 -j MASQUERADE if i uncheck this option, the last line will not appear. Then I guess probably the my issue will be solved by iptables and NAT related commands, I just haven't got enough knowledge to figure them out. I tried run iptables -t nat -I POSTROUTING -s 192.168.1.6 -o tun11 -j MASQUERADE manually after openvpn connected (192.168.1.6 is the ip address of my iPad), then my iPad get internet with openvpn tunnel, however all other devices can't reach internet. in case if needed, here is the iptables about NAT root@router:/tmp/home/root# iptables -t nat -L -n Chain PREROUTING (policy ACCEPT) target prot opt source destination DROP all -- 0.0.0.0/0 192.168.1.0/24 WANPREROUTING all -- 0.0.0.0/0 172.200.1.43 upnp all -- 0.0.0.0/0 172.200.1.43 Chain POSTROUTING (policy ACCEPT) target prot opt source destination MASQUERADE all -- 0.0.0.0/0 0.0.0.0/0 SNAT all -- 192.168.1.0/24 192.168.1.0/24 to:192.168.1.1 Chain OUTPUT (policy ACCEPT) target prot opt source destination Chain WANPREROUTING (1 references) target prot opt source destination DNAT icmp -- 0.0.0.0/0 0.0.0.0/0 to:192.168.1.1 Chain upnp (1 references) target prot opt source destination DNAT udp -- 0.0.0.0/0 0.0.0.0/0 udp dpt:5353 to:192.168.1.3:5353 Thanks in advance for helping and read this so much, I hope i made every info you need to give a help :)


  • Social Business Forum Milano: Day 2

    - by me
    @YourService. The business world has flipped and small business can capitalize - by Frank Eliason (twitter: @FrankEliason)

    Technology and social media tools have made it easier than ever for companies to communicate with consumers. They can listen and join in on conversations, solve problems, get instant feedback about their products and services, and more. So why, then, are most companies not doing this? Instead, it seems as if customer service is at an all time low, and that the few companies who are choosing to focus on their customers are experiencing a great competitive advantage. At Your Service explains the importance of refocusing your business on your customers and your employees, and just how to do it:

    - Explains how to create a culture of empowered employees who understand the value of a great customer experience
    - Advises on the need to communicate that experience to their customers and potential customers

    Frank Eliason, recognized by BusinessWeek as the 'most famous customer service manager in the US, possibly in the world,' has built a reputation for helping large businesses improve the way they connect with customers and enhance their relationships.

    Quotes from the Audience:

    - Bertrand Duperrin @bduperrin: social service is not about shutting up the loudest cutsomers ! #sbf12 @frankeliason
    - Paolo Pelloni @paolopelloni
    - Gautam Ghosh @GautamGhosh: RT @cecildijoux: #sbf12 @frankeliason you need to change things and fix the approach it's not about social media it's about driving change
    - Peter H. Reiser @peterreiser: #sbf12 Company Experience = Product Experience + Customer Interactions + Employee Experience @yourservice

    Engage or lose! Socialize, mobilize, conversify: engage your employees to improve business performance - by Christian Finn (twitter: @cfinn)

    First Christian was presenting the flying monkey. Then he outlined the four principles to fix the Intranet:

    1. Socialize the Intranet
    2. Get Thee to a Single Repository
    3. Mobilize the Intranet
    4. Conversationalize Your Processes

    Quotes from the Audience:

    - Oscar Berg @oscarberg: Engaged employees think their work bring out the best of their ideas @cfinn #sbf12 http://pic.twitter.com/68eddp48
    - John Stepper @johnstepper: I like @cfinn's "conversify your processes" A nice related concept to "narrating your work", part of working out loud. http://johnstepper.com/2012/05/26/working-out-loud-your-personal-content-strategy/
    - Oscar Berg @oscarberg: Organizations are talent markets - socializing your intranet makes this market function better @cfinn #sbf12

    For profit, productivity, and personal benefit: creating a collaborative culture at Deutsche Bank - by John Stepper (twitter: @johnstepper)

    Driving adoption of collaboration and social media platforms at Deutsche Bank. John shared some great best practices on how to deploy an enterprise-wide community model in a large company. He started with the most important question: what is the commercial value of adding social? Then he talked about the success of the Community of Practice deployment and outlined some key use cases, including the relevant measures to prove the ROI of the investment. Examples:

    - Community of practice -> measure: systematic collection of value stories
    - Self-service website -> measure: based on representative models
    - Optimizing asset inventory -> measure: actual counts

    This last use case was particularly interesting. It is a crowd-sourced spending/saving model for infrastructure: users can cancel IT services they don't need (for example, Software xx).
    5% of the saving goes to social responsibility projects. Then John outlined some best practices on how to address the WIIFM (What's In It For Me) question of the individual users:

    - change from hierarchy to graph
    - working out loud = observable work + narrating your work
    - add social skills to career objectives - example: building a purposeful social network course/training as part of the job development curriculum

    And last but not least, John gave some important tips on how to get senior management buy-in: establish management-sponsored, division-level collaboration boards which define clear use cases and measures. These divisional use cases are then implemented using a common social platform. Thanks John - I learned a lot from your presentation!

    Quotes from the Audience:

    - Ana Silva @AnaDataGirl: #sbf12 what's in it for individuals at Deutsche Bank? Shapping their reputations in a big org says @johnstepper #e20
    - Ana Silva @AnaDataGirl: Any reason why not? MT @magatorlibero #sbf12 is Deutsche B. experience on applying social inside company applicable to Italian people?
    - Oscar Berg @oscarberg: Your career is not a ladder, it is a network that opens up opportunities - @johnstepper #sbf12
    - Oscar Berg @oscarberg: @johnstepper: Institutionalizing collaboration is next - collaboration woven into the fabric of daily work #sbf12
    - Ana Silva @AnaDataGirl: #sbf12 @johnstepper talking about how Deutsche Bank is using #socbiz to build purposeful CoP & save money


  • Can I delete libc-bin?

    - by Balazs Szikszay
    Question is simple, I need to know because I cant upgrade/install anything, because it always says I have to uninstall/delete it to continue. It also says dont do it, if I dont know what I am doing. EDIT: szikszay@szikszay-Latitude-E5530-non-vPro:~$ sudo apt-get upgrade Reading package lists... Done Building dependency tree Reading state information... Done You might want to run ‘apt-get -f install’ to correct these. The following packages have unmet dependencies. ia32-libs-multiarch:i386 : Depends: libqtcore4:i386 but it is not installed Depends: libqtgui4:i386 but it is not installed Depends: libqt4-dbus:i386 but it is not installed Depends: libqt4-network:i386 but it is not installed Depends: libqt4-opengl:i386 but it is not installed Depends: libqt4-qt3support:i386 but it is not installed Depends: libqt4-script:i386 but it is not installed Depends: libqt4-scripttools:i386 but it is not installed Depends: libqt4-sql:i386 but it is not installed Depends: libqt4-svg:i386 but it is not installed Depends: libqt4-test:i386 but it is not installed Depends: libqt4-xml:i386 but it is not installed Depends: libqt4-xmlpatterns:i386 but it is not installed Depends: libcups2:i386 but it is not installed Depends: libcupsimage2:i386 but it is not installed Depends: libcurl3:i386 but it is not installed Depends: libnss3:i386 but it is not installed Depends: libnspr4:i386 but it is not installed Depends: libssl1.0.0:i386 but it is not installed Recommends: libgl1-mesa-glx:i386 but it is not installed Recommends: libgl1-mesa-dri:i386 but it is not installed lib32ffi6 : Depends: libc6-i386 (= 2.4) but it is not installed lib32gcc1 : Depends: libc6-i386 (= 2.5) but it is not installed lib32nss-mdns : Depends: libc6-i386 (= 2.4) but it is not installed lib32stdc++6 : Depends: libc6-i386 (= 2.4) but it is not installed lib32z1 : Depends: libc6-i386 (= 2.4) but it is not installed libacl1:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libattr1:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libaudio2:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libavahi-client3:i386 : Depends: libc6:i386 (= 2.4) but it is not installed Depends: libdbus-1-3:i386 (= 1.1.1) but it is not installed libavahi-common3:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libcomerr2:i386 : Depends: libc6:i386 (= 2.12) but it is not installed libdb5.1:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libdrm-intel1:i386 : Depends: libc6:i386 (= 2.3.4) but it is not installed libdrm-nouveau1a:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libdrm-radeon1:i386 : Depends: libc6:i386 (= 2.3.4) but it is not installed libdrm2:i386 : Depends: libc6:i386 (= 2.7) but it is not installed libffi6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libfontconfig1:i386 : Depends: libc6:i386 (= 2.7) but it is not installed Depends: libexpat1:i386 (= 1.95.8) but it is not installed Depends: libfreetype6:i386 (= 2.2.1) but it is not installed libgcc1:i386 : Depends: libc6:i386 (= 2.2.4) but it is not installed libgcrypt11:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libgdbm3:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libglib2.0-0:i386 : Depends: libc6:i386 (= 2.9) but it is not installed libgpg-error0:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libice6:i386 : Depends: libc6:i386 (= 2.11) but it is not installed libidn11:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libjpeg62:i386 : Depends: libc6:i386 (= 2.7) but 
it is not installed libkeyutils1:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed liblcms1:i386 : Depends: libc6:i386 (= 2.7) but it is not installed libllvm2.9:i386 : Depends: libc6:i386 (= 2.11) but it is not installed libmng1:i386 : Depends: libc6:i386 (= 2.11) but it is not installed libpciaccess0:i386 : Depends: libc6:i386 (= 2.7) but it is not installed libpcre3:i386 : Depends: libc6:i386 (= 2.4) but it is not installed librtmp0:i386 : Depends: libc6:i386 (= 2.7) but it is not installed Depends: libgnutls26:i386 (= 2.9.11-0) but it is not installed libsasl2-2:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libsasl2-modules:i386 : Depends: libc6:i386 (= 2.4) but it is not installed Depends: libssl1.0.0:i386 (= 1.0.0) but it is not installed libselinux1:i386 : Depends: libc6:i386 (= 2.8) but it is not installed libsm6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libsqlite3-0:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libstdc++6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libuuid1:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libx11-6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libxau6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libxcb1:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libxdamage1:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libxdmcp6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libxext6:i386 : Depends: libc6:i386 (= 2.4) but it is not installed libxfixes3:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libxrender1:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libxss1:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed libxt6:i386 : Depends: libc6:i386 (= 2.7) but it is not installed libxxf86vm1:i386 : Depends: libc6:i386 (= 2.1.3) but it is not installed zlib1g:i386 : Depends: libc6:i386 (= 2.4) but it is not installed E: Unmet dependencies. Try using -f. szikszay@szikszay-Latitude-E5530-non-vPro:~$ sudo apt-get upgrade -f Reading package lists... Done Building dependency tree Reading state information... Done Correcting dependencies... 
Done The following packages will be REMOVED libc-bin The following NEW packages will be installed libc-bin:i386 libc6:i386 libc6-i386 libcups2:i386 libcupsimage2:i386 libcurl3:i386 libdbus-1-3:i386 libexpat1:i386 libfreetype6:i386 libgl1-mesa-dri:i386 libgl1-mesa-glx:i386 libglapi-mesa:i386 libgnutls26:i386 libgssapi-krb5-2:i386 libk5crypto3:i386 libkrb5-3:i386 libkrb5support0:i386 libldap-2.4-2:i386 libnspr4:i386 libnss3:i386 libpng12-0:i386 libqt4-dbus:i386 libqt4-declarative:i386 libqt4-designer:i386 libqt4-network:i386 libqt4-opengl:i386 libqt4-qt3support:i386 libqt4-script:i386 libqt4-scripttools:i386 libqt4-sql:i386 libqt4-svg:i386 libqt4-test:i386 libqt4-xml:i386 libqt4-xmlpatterns:i386 libqtcore4:i386 libqtgui4:i386 libssl1.0.0:i386 libtasn1-3:i386 libtiff4:i386 libxi6:i386 The following packages have been kept back: ginn libgrip0 linux-headers-generic linux-image-generic unity unity-common xserver-xorg-input-evdev xserver-xorg-input-synaptics The following packages will be upgraded: accountsservice acpi-support acpid aisleriot alsa-utils app-install-data-partner apparmor appmenu-qt apport apport-gtk apt apt-transport-https apt-utils aptdaemon aptdaemon-data apturl apturl-common at-spi2-core bamfdaemon banshee banshee-extension-soundmenu banshee-extension-ubuntuonemusicstore baobab bind9-host binutils bluez bluez-alsa bluez-cups bluez-gstreamer brasero brasero-cdrkit brasero-common brltty bzip2 ca-certificates-java checkbox checkbox-gtk colord command-not-found command-not-found-data compiz compiz-core compiz-gnome compiz-plugins-default compiz-plugins-main-default cups cups-bsd cups-client cups-common cups-ppdc dbus dbus-x11 deja-dup desktop-file-utils dnsutils dpkg ecryptfs-utils empathy empathy-common eog evince evince-common evolution-data-server evolution-data-server-common file-roller firefox firefox-globalmenu firefox-gnome-support firefox-locale-en firefox-locale-hu gbrainy gcalctool gconf2 gconf2-common gedit gedit-common ghostscript ghostscript-cups ghostscript-x gir1.2-atspi-2.0 gir1.2-gconf-2.0 gir1.2-gnomebluetooth-1.0 gir1.2-gtk-3.0 gir1.2-gtksource-3.0 gir1.2-totem-1.0 gir1.2-unity-4.0 gir1.2-webkit-3.0 gnome-accessibility-themes gnome-bluetooth gnome-control-center gnome-control-center-data gnome-desktop3-data gnome-font-viewer gnome-games-common gnome-icon-theme gnome-keyring gnome-mahjongg gnome-online-accounts gnome-orca gnome-power-manager gnome-screenshot gnome-search-tool gnome-session gnome-session-bin gnome-session-canberra gnome-session-common gnome-settings-daemon gnome-sudoku gnome-system-log gnome-system-monitor gnome-utils-common gnomine gnupg gpgv grub-common grub-pc grub-pc-bin grub2-common gstreamer0.10-gconf gstreamer0.10-plugins-good gstreamer0.10-pulseaudio gvfs gvfs-backends gvfs-bin gvfs-fuse gwibber gwibber-service gwibber-service-facebook gwibber-service-identica gwibber-service-twitter gzip hpijs hplip hplip-cups hplip-data icedtea-6-jre-cacao icedtea-6-jre-jamvm icedtea-netx ifupdown im-switch indicator-datetime indicator-session indicator-sound initramfs-tools initramfs-tools-bin initscripts insserv isc-dhcp-client isc-dhcp-common iso-codes jockey-common jockey-gtk language-pack-en language-pack-en-base language-pack-gnome-en language-pack-gnome-en-base language-pack-gnome-hu language-pack-gnome-hu-base language-pack-hu language-pack-hu-base language-selector-common language-selector-gnome libaccountsservice0 libapt-inst1.3 libapt-pkg4.11 libarchive1 libasound2-plugins libatk-adaptor libatspi2.0-0 libbamf0 libbamf3-0 libbind9-60 
libbluetooth3 libbrasero-media3-1 libbrlapi0.5 libbz2-1.0 libc-dev-bin libc6 libc6-dev libcamel-1.2-29 libcanberra-gtk-module libcanberra-gtk0 libcanberra-gtk3-0 libcanberra-gtk3-module libcanberra-pulse libcanberra0 libcolord1 libcups2 libcupscgi1 libcupsdriver1 libcupsimage2 libcupsmime1 libcupsppdc1 libcurl3-gnutls libdbus-1-3 libdbus-glib-1-2 libdecoration0 libdns69 libebackend-1.2-1 libebook1.2-12 libecal1.2-10 libecryptfs0 libedata-book-1.2-11 libedata-cal-1.2-13 libedataserver1.2-15 libedataserverui-3.0-1 libevince3-3 libexif12 libexpat1 libfreetype6 libgail-3-0 libgail-3-common libgck-1-0 libgconf2-4 libgcr-3-1 libgdata-common libgdata13 libgl1-mesa-dri libgl1-mesa-glx libglapi-mesa libglu1-mesa libgnome-bluetooth8 libgnome-control-center1 libgnome-desktop-3-2 libgnutls26 libgoa-1.0-0 libgs9 libgs9-common libgssapi-krb5-2 libgtk-3-0 libgtk-3-bin libgtk-3-common libgtksourceview-3.0-0 libgtksourceview-3.0-common libgudev-1.0-0 libgweather-3-0 libgweather-common libgwibber-gtk2 libgwibber2 libhpmud0 libicu44 libimobiledevice2 libisc62 libisccc60 libisccfg62 libjasper1 libjs-jquery libk5crypto3 libkrb5-3 libkrb5support0 libldap-2.4-2 liblightdm-gobject-1-0 liblwres60 libmetacity-private0 libmission-control-plugins0 libmono-cairo4.0-cil libmono-corlib4.0-cil libmono-csharp4.0-cil libmono-i18n-west4.0-cil libmono-i18n4.0-cil libmono-posix4.0-cil libmono-security4.0-cil libmono-sharpzip4.84-cil libmono-system-configuration4.0-cil libmono-system-core4.0-cil libmono-system-drawing4.0-cil libmono-system-security4.0-cil libmono-system-xml4.0-cil libmono-system4.0-cil libmono-zeroconf1.0-cil libmysqlclient16 libnautilus-extension1 libncurses5 libncursesw5 libnm-glib-vpn1 libnm-glib4 libnm-gtk-common libnm-gtk0 libnm-util2 libnotify0.4-cil libnspr4 libnss3 libnss3-1d libnux-1.0-0 libnux-1.0-common libpam-gnome-keyring libpam-modules libpam-modules-bin libpam-runtime libpam0g libperl5.12 libpng12-0 libpoppler-glib6 libpoppler13 libproxy0 libpulse-mainloop-glib0 libpulse0 libpurple-bin libpurple0 libpython2.7 libqt4-dbus libqt4-declarative libqt4-network libqt4-opengl libqt4-script libqt4-sql libqt4-sql-mysql libqt4-svg libqt4-xml libqt4-xmlpatterns libqtcore4 libqtgui4 libreoffice-base-core libreoffice-calc libreoffice-common libreoffice-core libreoffice-draw libreoffice-emailmerge libreoffice-gnome libreoffice-gtk libreoffice-help-en-gb libreoffice-help-en-us libreoffice-help-hu libreoffice-impress libreoffice-l10n-common libreoffice-l10n-en-gb libreoffice-l10n-en-za libreoffice-l10n-hu libreoffice-math libreoffice-style-human libreoffice-writer libsane-hpaio libsmbclient libsnmp-base libsnmp15 libssl1.0.0 libsyncdaemon-1.0-1 libt1-5 libtasn1-3 libtiff4 libtinfo5 libtotem0 libubuntuone-1.0-1 libubuntuone1.0-cil libudev0 libunity-core-4.0-4 libunity6 libusbmuxd1 libv4l-0 libvorbis0a libvorbisenc2 libvorbisfile3 libwbclient0 libwebkitgtk-1.0-0 libwebkitgtk-1.0-common libwebkitgtk-3.0-0 libwebkitgtk-3.0-common libxi6 libxml2 libxslt1.1 lightdm linux-firmware linux-libc-dev mawk metacity metacity-common mobile-broadband-provider-info modemmanager mono-4.0-gac mono-gac mono-runtime mousetweaks multiarch-support mysql-common nautilus nautilus-data nautilus-sendto-empathy ncurses-base ncurses-bin network-manager network-manager-gnome nux-tools onboard openjdk-6-jre openjdk-6-jre-headless openjdk-6-jre-lib openssl perl perl-base perl-modules poppler-utils pulseaudio pulseaudio-esound-compat pulseaudio-module-bluetooth pulseaudio-module-gconf pulseaudio-module-x11 pulseaudio-utils python-apport 
python-aptdaemon python-aptdaemon-gtk python-aptdaemon.gtk3widgets python-aptdaemon.gtkwidgets python-brlapi python-crypto python-cups python-cupshelpers python-egenix-mxdatetime python-egenix-mxtools python-gobject python-gobject-cairo python-httplib2 python-keyring python-launchpadlib python-libproxy python-libxml2 python-pam python-papyon python-pkg-resources python-problem-report python-pyatspi2 python-software-properties python-ubuntuone-client python-ubuntuone-storageprotocol python-uno python2.7 python2.7-minimal qdbus samba-common samba-common-bin seahorse shotwell simple-scan smbclient sni-qt software-center software-properties-common software-properties-gtk sudo system-config-printer-common system-config-printer-gnome system-config-printer-udev sysv-rc sysvinit-utils telepathy-indicator telepathy-mission-control-5 thunderbird thunderbird-globalmenu thunderbird-gnome-support thunderbird-locale-en thunderbird-locale-en-gb thunderbird-locale-en-us thunderbird-locale-hu tomboy totem totem-common totem-mozilla totem-plugins transmission-common transmission-gtk ttf-opensymbol tzdata tzdata-java ubuntu-desktop ubuntu-docs ubuntu-minimal ubuntu-sso-client ubuntu-standard ubuntuone-client ubuntuone-client-gnome ubuntuone-couch udev unity-lens-applications unity-services uno-libs3 update-manager update-manager-core update-notifier update-notifier-common upstart ure usbmuxd vim-common vim-tiny vinagre vino whois x11-common xdiagnose xorg xserver-common xserver-xorg xserver-xorg-core xserver-xorg-input-all xserver-xorg-video-all xserver-xorg-video-intel xserver-xorg-video-openchrome xserver-xorg-video-qxl xul-ext-ubufox WARNING: The following essential packages will be removed. This should NOT be done unless you know exactly what you are doing! libc-bin 498 upgraded, 40 newly installed, 1 to remove and 8 not upgraded. 69 not fully installed or removed. Need to get 439 MB of archives. After this operation, 135 MB of additional disk space will be used. You are about to do something potentially harmful To continue type in the phrase ‘Yes, do as I say!’ ?] I tried to upgrade but it gives me an error, when i try to upgrade-f it says i should delete libc-bin. Thanks for the answers btw. EDIT2: it also says this: The package system is broken If you are using third party repositories then disable them, since they are a common source of problems. Now run the following command in a terminal: apt-get install -f


  • Residual packages Ubuntu 12.04

    - by hydroxide
    I have an Asus Q500A with win8 and Ubuntu 12.04 64 bit; Linux kernel 3.8.0-32-generic. I have been having residual package issues which have been giving me trouble trying to reconfigure xserver-xorg-lts-raring. I tried removing all residual packages from synaptic but the following were not removed. Output of sudo dpkg -l | grep "^rc" rc gstreamer0.10-plugins-good:i386 0.10.31-1ubuntu1.2 GStreamer plugins from the "good" set rc libaa1:i386 1.4p5-39ubuntu1 ASCII art library rc libaio1:i386 0.3.109-2ubuntu1 Linux kernel AIO access library - shared library rc libao4:i386 1.1.0-1ubuntu2 Cross Platform Audio Output Library rc libasn1-8-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - ASN.1 library rc libasound2:i386 1.0.25-1ubuntu10.2 shared library for ALSA applications rc libasyncns0:i386 0.8-4 Asynchronous name service query library rc libatk1.0-0:i386 2.4.0-0ubuntu1 ATK accessibility toolkit rc libavahi-client3:i386 0.6.30-5ubuntu2 Avahi client library rc libavahi-common3:i386 0.6.30-5ubuntu2 Avahi common library rc libavc1394-0:i386 0.5.3-1ubuntu2 control IEEE 1394 audio/video devices rc libcaca0:i386 0.99.beta17-2.1ubuntu2 colour ASCII art library rc libcairo-gobject2:i386 1.10.2-6.1ubuntu3 The Cairo 2D vector graphics library (GObject library) rc libcairo2:i386 1.10.2-6.1ubuntu3 The Cairo 2D vector graphics library rc libcanberra-gtk0:i386 0.28-3ubuntu3 GTK+ helper for playing widget event sounds with libcanberra rc libcanberra0:i386 0.28-3ubuntu3 simple abstract interface for playing event sounds rc libcap2:i386 1:2.22-1ubuntu3 support for getting/setting POSIX.1e capabilities rc libcdparanoia0:i386 3.10.2+debian-10ubuntu1 audio extraction tool for sampling CDs (library) rc libcroco3:i386 0.6.5-1ubuntu0.1 Cascading Style Sheet (CSS) parsing and manipulation toolkit rc libcups2:i386 1.5.3-0ubuntu8 Common UNIX Printing System(tm) - Core library rc libcupsimage2:i386 1.5.3-0ubuntu8 Common UNIX Printing System(tm) - Raster image library rc libcurl3:i386 7.22.0-3ubuntu4.3 Multi-protocol file transfer library (OpenSSL) rc libdatrie1:i386 0.2.5-3 Double-array trie library rc libdbus-glib-1-2:i386 0.98-1ubuntu1.1 simple interprocess messaging system (GLib-based shared library) rc libdbusmenu-qt2:i386 0.9.2-0ubuntu1 Qt implementation of the DBusMenu protocol rc libdrm-nouveau2:i386 2.4.43-0ubuntu0.0.3 Userspace interface to nouveau-specific kernel DRM services -- runtime rc libdv4:i386 1.0.0-3ubuntu1 software library for DV format digital video (runtime lib) rc libesd0:i386 0.2.41-10build3 Enlightened Sound Daemon - Shared libraries rc libexif12:i386 0.6.20-2ubuntu0.1 library to parse EXIF files rc libexpat1:i386 2.0.1-7.2ubuntu1.1 XML parsing C library - runtime library rc libflac8:i386 1.2.1-6 Free Lossless Audio Codec - runtime C library rc libfontconfig1:i386 2.8.0-3ubuntu9.1 generic font configuration library - runtime rc libfreetype6:i386 2.4.8-1ubuntu2.1 FreeType 2 font engine, shared library files rc libgail18:i386 2.24.10-0ubuntu6 GNOME Accessibility Implementation Library -- shared libraries rc libgconf-2-4:i386 3.2.5-0ubuntu2 GNOME configuration database system (shared libraries) rc libgcrypt11:i386 1.5.0-3ubuntu0.2 LGPL Crypto library - runtime library rc libgd2-xpm:i386 2.0.36~rc1~dfsg-6ubuntu2 GD Graphics Library version 2 rc libgdbm3:i386 1.8.3-10 GNU dbm database routines (runtime version) rc libgdk-pixbuf2.0-0:i386 2.26.1-1 GDK Pixbuf library rc libgif4:i386 4.1.6-9ubuntu1 library for GIF images (library) rc libgl1-mesa-dri-lts-quantal:i386 
9.0.3-0ubuntu0.4~precise1 free implementation of the OpenGL API -- DRI modules rc libgl1-mesa-dri-lts-raring:i386 9.1.4-0ubuntu0.1~precise2 free implementation of the OpenGL API -- DRI modules rc libgl1-mesa-glx:i386 8.0.4-0ubuntu0.6 free implementation of the OpenGL API -- GLX runtime rc libgl1-mesa-glx-lts-quantal:i386 9.0.3-0ubuntu0.4~precise1 free implementation of the OpenGL API -- GLX runtime rc libgl1-mesa-glx-lts-raring:i386 9.1.4-0ubuntu0.1~precise2 free implementation of the OpenGL API -- GLX runtime rc libglapi-mesa:i386 8.0.4-0ubuntu0.6 free implementation of the GL API -- shared library rc libglapi-mesa-lts-quantal:i386 9.0.3-0ubuntu0.4~precise1 free implementation of the GL API -- shared library rc libglapi-mesa-lts-raring:i386 9.1.4-0ubuntu0.1~precise2 free implementation of the GL API -- shared library rc libglu1-mesa:i386 8.0.4-0ubuntu0.6 Mesa OpenGL utility library (GLU) rc libgnome-keyring0:i386 3.2.2-2 GNOME keyring services library rc libgnutls26:i386 2.12.14-5ubuntu3.5 GNU TLS library - runtime library rc libgomp1:i386 4.6.3-1ubuntu5 GCC OpenMP (GOMP) support library rc libgpg-error0:i386 1.10-2ubuntu1 library for common error values and messages in GnuPG components rc libgphoto2-2:i386 2.4.13-1ubuntu1.2 gphoto2 digital camera library rc libgphoto2-port0:i386 2.4.13-1ubuntu1.2 gphoto2 digital camera port library rc libgssapi-krb5-2:i386 1.10+dfsg~beta1-2ubuntu0.3 MIT Kerberos runtime libraries - krb5 GSS-API Mechanism rc libgssapi3-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - GSSAPI support library rc libgstreamer-plugins-base0.10-0:i386 0.10.36-1ubuntu0.1 GStreamer libraries from the "base" set rc libgstreamer0.10-0:i386 0.10.36-1ubuntu1 Core GStreamer libraries and elements rc libgtk2.0-0:i386 2.24.10-0ubuntu6 GTK+ graphical user interface library rc libgudev-1.0-0:i386 1:175-0ubuntu9.4 GObject-based wrapper library for libudev rc libhcrypto4-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - crypto library rc libheimbase1-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - Base library rc libheimntlm0-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - NTLM support library rc libhx509-5-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - X509 support library rc libibus-1.0-0:i386 1.4.1-3ubuntu1 Intelligent Input Bus - shared library rc libice6:i386 2:1.0.7-2build1 X11 Inter-Client Exchange library rc libidn11:i386 1.23-2 GNU Libidn library, implementation of IETF IDN specifications rc libiec61883-0:i386 1.2.0-0.1ubuntu1 an partial implementation of IEC 61883 rc libieee1284-3:i386 0.2.11-10build1 cross-platform library for parallel port access rc libjack-jackd2-0:i386 1.9.8~dfsg.1-1ubuntu2 JACK Audio Connection Kit (libraries) rc libjasper1:i386 1.900.1-13 JasPer JPEG-2000 runtime library rc libjpeg-turbo8:i386 1.1.90+svn733-0ubuntu4.2 IJG JPEG compliant runtime library. 
rc libjson0:i386 0.9-1ubuntu1 JSON manipulation library - shared library rc libk5crypto3:i386 1.10+dfsg~beta1-2ubuntu0.3 MIT Kerberos runtime libraries - Crypto Library rc libkeyutils1:i386 1.5.2-2 Linux Key Management Utilities (library) rc libkrb5-26-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - libraries rc libkrb5-3:i386 1.10+dfsg~beta1-2ubuntu0.3 MIT Kerberos runtime libraries rc libkrb5support0:i386 1.10+dfsg~beta1-2ubuntu0.3 MIT Kerberos runtime libraries - Support library rc liblcms1:i386 1.19.dfsg-1ubuntu3 Little CMS color management library rc libldap-2.4-2:i386 2.4.28-1.1ubuntu4.4 OpenLDAP libraries rc libllvm3.0:i386 3.0-4ubuntu1 Low-Level Virtual Machine (LLVM), runtime library rc libllvm3.1:i386 3.1-2ubuntu1~12.04.1 Low-Level Virtual Machine (LLVM), runtime library rc libllvm3.2:i386 3.2-2ubuntu5~precise1 Low-Level Virtual Machine (LLVM), runtime library rc libltdl7:i386 2.4.2-1ubuntu1 A system independent dlopen wrapper for GNU libtool rc libmad0:i386 0.15.1b-7ubuntu1 MPEG audio decoder library rc libmikmod2:i386 3.1.12-2 Portable sound library rc libmng1:i386 1.0.10-3 Multiple-image Network Graphics library rc libmpg123-0:i386 1.12.1-3.2ubuntu1 MPEG layer 1/2/3 audio decoder -- runtime library rc libmysqlclient18:i386 5.5.32-0ubuntu0.12.04.1 MySQL database client library rc libnspr4:i386 4.9.5-0ubuntu0.12.04.1 NetScape Portable Runtime Library rc libnss3:i386 3.14.3-0ubuntu0.12.04.1 Network Security Service libraries rc libodbc1:i386 2.2.14p2-5ubuntu3 ODBC library for Unix rc libogg0:i386 1.2.2~dfsg-1ubuntu1 Ogg bitstream library rc libopenal1:i386 1:1.13-4ubuntu3 Software implementation of the OpenAL API (shared library) rc liborc-0.4-0:i386 1:0.4.16-1ubuntu2 Library of Optimized Inner Loops Runtime Compiler rc libosmesa6:i386 8.0.4-0ubuntu0.6 Mesa Off-screen rendering extension rc libp11-kit0:i386 0.12-2ubuntu1 Library for loading and coordinating access to PKCS#11 modules - runtime rc libpango1.0-0:i386 1.30.0-0ubuntu3.1 Layout and rendering of internationalized text rc libpixman-1-0:i386 0.24.4-1 pixel-manipulation library for X and cairo rc libproxy1:i386 0.4.7-0ubuntu4.1 automatic proxy configuration management library (shared) rc libpulse-mainloop-glib0:i386 1:1.1-0ubuntu15.4 PulseAudio client libraries (glib support) rc libpulse0:i386 1:1.1-0ubuntu15.4 PulseAudio client libraries rc libqt4-dbus:i386 4:4.8.1-0ubuntu4.4 Qt 4 D-Bus module rc libqt4-declarative:i386 4:4.8.1-0ubuntu4.4 Qt 4 Declarative module rc libqt4-designer:i386 4:4.8.1-0ubuntu4.4 Qt 4 designer module rc libqt4-network:i386 4:4.8.1-0ubuntu4.4 Qt 4 network module rc libqt4-opengl:i386 4:4.8.1-0ubuntu4.4 Qt 4 OpenGL module rc libqt4-qt3support:i386 4:4.8.1-0ubuntu4.4 Qt 3 compatibility library for Qt 4 rc libqt4-script:i386 4:4.8.1-0ubuntu4.4 Qt 4 script module rc libqt4-scripttools:i386 4:4.8.1-0ubuntu4.4 Qt 4 script tools module rc libqt4-sql:i386 4:4.8.1-0ubuntu4.4 Qt 4 SQL module rc libqt4-svg:i386 4:4.8.1-0ubuntu4.4 Qt 4 SVG module rc libqt4-test:i386 4:4.8.1-0ubuntu4.4 Qt 4 test module rc libqt4-xml:i386 4:4.8.1-0ubuntu4.4 Qt 4 XML module rc libqt4-xmlpatterns:i386 4:4.8.1-0ubuntu4.4 Qt 4 XML patterns module rc libqtcore4:i386 4:4.8.1-0ubuntu4.4 Qt 4 core module rc libqtgui4:i386 4:4.8.1-0ubuntu4.4 Qt 4 GUI module rc libqtwebkit4:i386 2.2.1-1ubuntu4 Web content engine library for Qt rc libraw1394-11:i386 2.0.7-1ubuntu1 library for direct access to IEEE 1394 bus (aka FireWire) rc libroken18-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - roken support library 
rc librsvg2-2:i386 2.36.1-0ubuntu1 SAX-based renderer library for SVG files (runtime) rc librtmp0:i386 2.4~20110711.gitc28f1bab-1 toolkit for RTMP streams (shared library) rc libsamplerate0:i386 0.1.8-4 Audio sample rate conversion library rc libsane:i386 1.0.22-7ubuntu1 API library for scanners rc libsasl2-2:i386 2.1.25.dfsg1-3ubuntu0.1 Cyrus SASL - authentication abstraction library rc libsdl-image1.2:i386 1.2.10-3 image loading library for Simple DirectMedia Layer 1.2 rc libsdl-mixer1.2:i386 1.2.11-7 Mixer library for Simple DirectMedia Layer 1.2, libraries rc libsdl-net1.2:i386 1.2.7-5 Network library for Simple DirectMedia Layer 1.2, libraries rc libsdl-ttf2.0-0:i386 2.0.9-1.1ubuntu1 ttf library for Simple DirectMedia Layer with FreeType 2 support rc libsdl1.2debian:i386 1.2.14-6.4ubuntu3 Simple DirectMedia Layer rc libshout3:i386 2.2.2-7ubuntu1 MP3/Ogg Vorbis broadcast streaming library rc libsm6:i386 2:1.2.0-2build1 X11 Session Management library rc libsndfile1:i386 1.0.25-4 Library for reading/writing audio files rc libsoup-gnome2.4-1:i386 2.38.1-1 HTTP library implementation in C -- GNOME support library rc libsoup2.4-1:i386 2.38.1-1 HTTP library implementation in C -- Shared library rc libspeex1:i386 1.2~rc1-3ubuntu2 The Speex codec runtime library rc libspeexdsp1:i386 1.2~rc1-3ubuntu2 The Speex extended runtime library rc libsqlite3-0:i386 3.7.9-2ubuntu1.1 SQLite 3 shared library rc libssl0.9.8:i386 0.9.8o-7ubuntu3.1 SSL shared libraries rc libstdc++5:i386 1:3.3.6-25ubuntu1 The GNU Standard C++ Library v3 rc libstdc++6:i386 4.6.3-1ubuntu5 GNU Standard C++ Library v3 rc libtag1-vanilla:i386 1.7-1ubuntu5 audio meta-data library - vanilla flavour rc libtasn1-3:i386 2.10-1ubuntu1.1 Manage ASN.1 structures (runtime) rc libtdb1:i386 1.2.9-4 Trivial Database - shared library rc libthai0:i386 0.1.16-3 Thai language support library rc libtheora0:i386 1.1.1+dfsg.1-3ubuntu2 The Theora Video Compression Codec rc libtiff4:i386 3.9.5-2ubuntu1.5 Tag Image File Format (TIFF) library rc libtxc-dxtn-s2tc0:i386 0~git20110809-2.1 Texture compression library for Mesa rc libunistring0:i386 0.9.3-5 Unicode string library for C rc libusb-0.1-4:i386 2:0.1.12-20 userspace USB programming library rc libv4l-0:i386 0.8.6-1ubuntu2 Collection of video4linux support libraries rc libv4lconvert0:i386 0.8.6-1ubuntu2 Video4linux frame format conversion library rc libvisual-0.4-0:i386 0.4.0-4 Audio visualization framework rc libvorbis0a:i386 1.3.2-1ubuntu3 The Vorbis General Audio Compression Codec (Decoder library) rc libvorbisenc2:i386 1.3.2-1ubuntu3 The Vorbis General Audio Compression Codec (Encoder library) rc libvorbisfile3:i386 1.3.2-1ubuntu3 The Vorbis General Audio Compression Codec (High Level API) rc libwavpack1:i386 4.60.1-2 audio codec (lossy and lossless) - library rc libwind0-heimdal:i386 1.6~git20120311.dfsg.1-2ubuntu0.1 Heimdal Kerberos - stringprep implementation rc libwrap0:i386 7.6.q-21 Wietse Venema's TCP wrappers library rc libx11-6:i386 2:1.4.99.1-0ubuntu2.2 X11 client-side library rc libx11-xcb1:i386 2:1.4.99.1-0ubuntu2.2 Xlib/XCB interface library rc libxau6:i386 1:1.0.6-4 X11 authorisation library rc libxaw7:i386 2:1.0.9-3ubuntu1 X11 Athena Widget library rc libxcb-dri2-0:i386 1.8.1-1ubuntu0.2 X C Binding, dri2 extension rc libxcb-glx0:i386 1.8.1-1ubuntu0.2 X C Binding, glx extension rc libxcb-render0:i386 1.8.1-1ubuntu0.2 X C Binding, render extension rc libxcb-shm0:i386 1.8.1-1ubuntu0.2 X C Binding, shm extension rc libxcb1:i386 1.8.1-1ubuntu0.2 X C Binding rc libxcomposite1:i386 
1:0.4.3-2build1 X11 Composite extension library rc libxcursor1:i386 1:1.1.12-1ubuntu0.1 X cursor management library rc libxdamage1:i386 1:1.1.3-2build1 X11 damaged region extension library rc libxdmcp6:i386 1:1.1.0-4 X11 Display Manager Control Protocol library rc libxext6:i386 2:1.3.0-3ubuntu0.1 X11 miscellaneous extension library rc libxfixes3:i386 1:5.0-4ubuntu4.1 X11 miscellaneous 'fixes' extension library rc libxft2:i386 2.2.0-3ubuntu2 FreeType-based font drawing library for X rc libxi6:i386 2:1.6.0-0ubuntu2.1 X11 Input extension library rc libxinerama1:i386 2:1.1.1-3ubuntu0.1 X11 Xinerama extension library rc libxml2:i386 2.7.8.dfsg-5.1ubuntu4.6 GNOME XML library rc libxmu6:i386 2:1.1.0-3 X11 miscellaneous utility library rc libxp6:i386 1:1.0.1-2ubuntu0.12.04.1 X Printing Extension (Xprint) client library rc libxpm4:i386 1:3.5.9-4 X11 pixmap library rc libxrandr2:i386 2:1.3.2-2ubuntu0.2 X11 RandR extension library rc libxrender1:i386 1:0.9.6-2ubuntu0.1 X Rendering Extension client library rc libxslt1.1:i386 1.1.26-8ubuntu1.3 XSLT 1.0 processing library - runtime library rc libxss1:i386 1:1.2.1-2 X11 Screen Saver extension library rc libxt6:i386 1:1.1.1-2ubuntu0.1 X11 toolkit intrinsics library rc libxtst6:i386 2:1.2.0-4ubuntu0.1 X11 Testing -- Record extension library rc libxv1:i386 2:1.0.6-2ubuntu0.1 X11 Video extension library rc libxxf86vm1:i386 1:1.1.1-2ubuntu0.1 X11 XFree86 video mode extension library rc odbcinst1debian2:i386 2.2.14p2-5ubuntu3 Support library for accessing odbc ini files rc skype-bin:i386 4.2.0.11-0ubuntu0.12.04.2 client for Skype VOIP and instant messaging service - binary files rc sni-qt:i386 0.2.5-0ubuntu3 indicator support for Qt rc wine-compholio:i386 1.7.4~ubuntu12.04.1 The Compholio Edition is a special build of the popular Wine software rc xaw3dg:i386 1.5+E-18.1ubuntu1 Xaw3d widget set

    Read the article

  • How to check if the tab page is dirty and prompt the user to save before navigating away using ajaxtoolkit tab control in ASP.NET

    Step 1: Put a hidden field inside the UpdatePanel:

    <asp:HiddenField ID="hfIsDirty" runat="server" Value="0" />

    Step 2: Wire up the AjaxControlToolkit TabContainer's OnClientActiveTabChanged="ActiveTabChanged" event and copy the following script into the .aspx page:

    <script type="text/javascript">
        // Trigger a server-side postback for the tab container
        function ActiveTabChanged(sender, e) {
            __doPostBack('<%= tcBaseline.ClientID %>', '');
        }
        // Sets the dirty flag if the page is dirty
        function setDirty() {
            var hf = document.getElementById("<%=hfIsDirty.ClientID%>");
            if (hf != null)
                hf.value = 1;
        }
        // Resets the dirty flag after save
        function clearDirty() {
            var hf = document.getElementById("<%=hfIsDirty.ClientID%>");
            hf.value = 0;
        }
        function showMessage() { return "page is dirty"; }
        function setControlChange() {
            if (typeof (event.srcElement) != 'undefined')
            { event.srcElement.onchange = setDirty; }
        }
        function checkDirty() {
            var tc = document.getElementById("<%=tcBaseline.ClientID%>");
            var hf = document.getElementById("<%=hfIsDirty.ClientID%>");
            if (hf.value == "1") {
                var conf = confirm("Do you want to lose unsaved changes? Click Cancel to stay on the page or OK to continue.");
                if (conf) {
                    clearDirty();
                    return true;
                }
                else {
                    var e = window.event;
                    e.cancelBubble = true;
                    if (e.stopPropagation) e.stopPropagation();
                    return false;
                }
            }
            else
                return true;
        }
        document.body.onclick = setControlChange;
        document.body.onkeyup = setControlChange;
        var onBeforeUnloadFired = false;
        // Function to reset the above flag.
        function resetOnBeforeUnloadFired() {
            onBeforeUnloadFired = false;
        }
        function doBeforeUnload() {
            var hf = document.getElementById("<%=hfIsDirty.ClientID%>");
            // If this function has not been run before...
            if (!onBeforeUnloadFired) {
                // Prevent this function from being run twice in succession.
                onBeforeUnloadFired = true;
                // If the form is dirty...
                if (hf.value == "1") {
                    event.returnValue = "If you continue you will lose any changes that you have made to this record.";
                }
            }
            window.setTimeout("resetOnBeforeUnloadFired()", 1000);
        }
        if (window.body) {
            window.body.onbeforeunload = doBeforeUnload;
        }
        else
            window.onbeforeunload = doBeforeUnload;
    </script>

    Step 3: Here is how the TabContainer markup should look:

    <asp:UpdatePanel ID="upTab" runat="server" UpdateMode="conditional">
        <ContentTemplate>
            <ajaxtoolkit:TabContainer ID="tcBaseline" runat="server" Height="400px" OnClientActiveTabChanged="ActiveTabChanged">
                <ajaxtoolkit:TabPanel ID="tpPersonalInformation" runat="server">
                    <HeaderTemplate>
                        <asp:Label ID="lblPITab" runat="server" Text="<%$ Resources:Resources, Baseline_Tab_PersonalInformation %>"
                            onclick="checkDirty();"></asp:Label>
                    </HeaderTemplate>
                    <ContentTemplate>
                        <asp:PlaceHolder ID="PlaceHolder1" runat="server"></asp:PlaceHolder>
                    </ContentTemplate>
                </ajaxtoolkit:TabPanel>
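
    The steps above handle dirty tracking on the client; resetting the flag after a successful save has to come from the server side, which the walkthrough only mentions in passing. Here is a minimal code-behind sketch of that step. It assumes a Save button handler named btnSave_Click and a SaveRecord() persistence call, both hypothetical, plus the hfIsDirty and upTab controls declared above.

    using System;
    using System.Web.UI;

    public partial class BaselinePage : Page
    {
        // Hypothetical Save handler; btnSave and SaveRecord() are placeholders,
        // while hfIsDirty and upTab are the controls declared in the markup above.
        protected void btnSave_Click(object sender, EventArgs e)
        {
            SaveRecord(); // placeholder for the real persistence logic

            // Option 1: reset the flag on the server so it round-trips as "clean".
            hfIsDirty.Value = "0";

            // Option 2: run the client-side clearDirty() once the partial postback completes.
            ScriptManager.RegisterStartupScript(
                upTab, upTab.GetType(), "clearDirty", "clearDirty();", true);
        }

        private void SaveRecord()
        {
            // Persistence logic would go here.
        }
    }

    Either option works; option 2 keeps the client-side functions as the single place where the flag is manipulated.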

    Read the article

  • To ref or not to ref

    - by nmarun
    So the question is what is the point of passing a reference type along with the ref keyword? I have an Employee class as below: 1: public class Employee 2: { 3: public string FirstName { get; set; } 4: public string LastName { get; set; } 5:  6: public override string ToString() 7: { 8: return string.Format("{0}-{1}", FirstName, LastName); 9: } 10: } In my calling class, I say: 1: class Program 2: { 3: static void Main() 4: { 5: Employee employee = new Employee 6: { 7: FirstName = "John", 8: LastName = "Doe" 9: }; 10: Console.WriteLine(employee); 11: CallSomeMethod(employee); 12: Console.WriteLine(employee); 13: } 14:  15: private static void CallSomeMethod(Employee employee) 16: { 17: employee.FirstName = "Smith"; 18: employee.LastName = "Doe"; 19: } 20: }   After having a look at the code, you’ll probably say, Well, an instance of a class gets passed as a reference, so any changes to the instance inside the CallSomeMethod, actually modifies the original object. Hence the output will be ‘John-Doe’ on the first call and ‘Smith-Doe’ on the second. And you’re right: So the question is what’s the use of passing this Employee parameter as a ref? 1: class Program 2: { 3: static void Main() 4: { 5: Employee employee = new Employee 6: { 7: FirstName = "John", 8: LastName = "Doe" 9: }; 10: Console.WriteLine(employee); 11: CallSomeMethod(ref employee); 12: Console.WriteLine(employee); 13: } 14:  15: private static void CallSomeMethod(ref Employee employee) 16: { 17: employee.FirstName = "Smith"; 18: employee.LastName = "Doe"; 19: } 20: } The output is still the same: Ok, so is there really a need to pass a reference type using the ref keyword? I’ll remove the ‘ref’ keyword and make one more change to the CallSomeMethod method. 1: class Program 2: { 3: static void Main() 4: { 5: Employee employee = new Employee 6: { 7: FirstName = "John", 8: LastName = "Doe" 9: }; 10: Console.WriteLine(employee); 11: CallSomeMethod(employee); 12: Console.WriteLine(employee); 13: } 14:  15: private static void CallSomeMethod(Employee employee) 16: { 17: employee = new Employee 18: { 19: FirstName = "Smith", 20: LastName = "John" 21: }; 22: } 23: } In line 17 you’ll see I’ve ‘new’d up the incoming Employee parameter and then set its properties to new values. The output tells me that the original instance of the Employee class does not change. Huh? But an instance of a class gets passed by reference, so why did the values not change on the original instance or how do I keep the two instances in-sync all the times? Aah, now here’s the answer. In order to keep the objects in sync, you pass them using the ‘ref’ keyword. 1: class Program 2: { 3: static void Main() 4: { 5: Employee employee = new Employee 6: { 7: FirstName = "John", 8: LastName = "Doe" 9: }; 10: Console.WriteLine(employee); 11: CallSomeMethod(ref employee); 12: Console.WriteLine(employee); 13: } 14:  15: private static void CallSomeMethod(ref Employee employee) 16: { 17: employee = new Employee 18: { 19: FirstName = "Smith", 20: LastName = "John" 21: }; 22: } 23: } Viola! Now, to prove it beyond doubt, I said, let me try with another reference type: string. 1: class Program 2: { 3: static void Main() 4: { 5: string name = "abc"; 6: Console.WriteLine(name); 7: CallSomeMethod(ref name); 8: Console.WriteLine(name); 9: } 10:  11: private static void CallSomeMethod(ref string name) 12: { 13: name = "def"; 14: } 15: } The output was as expected, first ‘abc’ and then ‘def’ - proves the 'ref' keyword works here as well. Now, what if I remove the ‘ref’ keyword? 
The output should still be the same as above, right, since string is a reference type? 1: class Program 2: { 3: static void Main() 4: { 5: string name = "abc"; 6: Console.WriteLine(name); 7: CallSomeMethod(name); 8: Console.WriteLine(name); 9: } 10:  11: private static void CallSomeMethod(string name) 12: { 13: name = "def"; 14: } 15: } Wrong: the output shows ‘abc’ printed twice. Wait a minute… how could this be? This is because string is an immutable type, which means that any time you modify an instance of a string, a new memory address is allocated for that instance. The effect is similar to ‘new’ing up the Employee instance inside CallSomeMethod in the absence of the ‘ref’ keyword. Verdict: the ref keyword came to the rescue and saved the planet… again!
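
    As a design note (not part of the original post): if the goal is simply to replace the caller's instance, the same effect can be achieved without ref by returning the new object and reassigning it at the call site. A minimal sketch, reusing the Employee class defined above:

    using System;

    class Program
    {
        static void Main()
        {
            Employee employee = new Employee { FirstName = "John", LastName = "Doe" };
            Console.WriteLine(employee);            // John-Doe
            employee = ReplaceEmployee(employee);   // the caller opts in to the swap
            Console.WriteLine(employee);            // Smith-John
        }

        // Returns a brand-new Employee instead of reassigning a ref parameter.
        private static Employee ReplaceEmployee(Employee employee)
        {
            return new Employee { FirstName = "Smith", LastName = "John" };
        }
    }

    The behavior is identical to the ref version, but the reassignment is explicit at the call site, which some teams prefer for readability.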

    Read the article

  • Routing to a Controller with no View in Angular

    - by Rick Strahl
    I've finally had some time to put Angular to use this week in a small project I'm working on for fun. Angular's routing is great and makes it really easy to map URL routes to controllers and model data into views. But what if you don't actually need a view, if you effectively need a headless controller that just runs code but doesn't render a view?

    Preserve the View

    When Angular navigates a route and presents a new view, it loads the controller and then renders the view from scratch. Views are not cached or stored, but displayed and then removed. So if you have routes configured like this:

    'use strict';
    // Declare app level module which depends on filters, and services
    window.myApp = angular.module('myApp', ['myApp.filters', 'myApp.services', 'myApp.directives', 'myApp.controllers']).
        config(['$routeProvider', function($routeProvider) {
            $routeProvider.when('/map', {
                templateUrl: "partials/map.html",
                controller: 'mapController',
                reloadOnSearch: false,
                animation: 'slide'
            });
            …
            $routeProvider.otherwise({redirectTo: '/map'});
        }]);

    Angular routes to the mapController and then re-renders the map.html template with the new data from the $scope filled in.

    But, but… I don't want a new View!

    Now in most cases this works just fine. If I'm rendering plain DOM content, or textboxes in a form interface, that is all fine and dandy - it's perfectly fine to completely re-render the UI. But in some cases the UI that's being managed has state and shouldn't be redrawn. In this case the main page in question has a Google Map on it. The map is going to be manipulated throughout the lifetime of the application and the rest of the pages. In my application I have a toolbar on the bottom and the rest of the content is replaced/switched out by the Angular views. The problem is that the map shouldn't be redrawn each time the Location view is activated. It should maintain its state, such as the current position selected (which can move), and shouldn't redraw due to the overhead of re-rendering the initial map. Originally I set up the map exactly like all my other views - as a partial that is rendered from a separate file - but that didn't work.

    The Workaround - Controller Only Routes

    The workaround for this goes decidedly against Angular's way of doing things:
    - Setting up a template-less route
    - In-lining the map view directly into the main page
    - Hiding and showing the map view manually
    Let's see how this works.

    Controller Only Route

    The template-less route is basically a route that doesn't have any template to render. This is not directly supported by Angular, but thankfully easy to fake. The end goal here is that I want to simply have the controller fire and then have the controller manage the display of the already active view by hiding and showing the map and any other view content, in effect bypassing Angular's view display management. In short - I want a controller action, but no view rendering. The controller-only or template-less route looks like this:

    $routeProvider.when('/map', {
        template: " ", // just fire controller
        controller: 'mapController',
        animation: 'slide'
    });

    Notice I'm using the template property rather than templateUrl (used in the first example above), which allows specifying a string template, and leaving it blank. The template property basically allows you to provide a templated string using Angular's Handlebars-like binding syntax, which can be useful at times. You can use plain strings or strings with template code in the template, or, as I'm doing here, a blank string to essentially fake 'just clear the view'.
    In-lined View

    So if there's no view, where does the HTML go? Because I don't want Angular to manage the view, the map markup is in-lined directly into the page. So instead of rendering the map into the Angular view container, the content is simply set up as inline HTML that displays as a sibling to the view container.

    <div id="MapContent" data-icon="LocationIcon" ng-controller="mapController" style="display:none">
        <div class="headerbar">
            <div class="right-header" style="float:right">
                <a id="btnShowSaveLocationDialog" class="iconbutton btn btn-sm" href="#/saveLocation" style="margin-right: 2px;">
                    <i class="icon-ok icon-2x" style="color: lightgreen;"></i>
                    Save Location
                </a>
            </div>
            <div class="left-header">GeoCrumbs</div>
        </div>
        <div class="clearfix"></div>
        <div id="Message">
            <i id="MessageIcon"></i>
            <span id="MessageText"></span>
        </div>
        <div id="Map" class="content-area"></div>
    </div>
    <div id="ViewPlaceholder" ng-view></div>

    Note that there's the #MapContent element and the #ViewPlaceholder. The #MapContent element is my static map view that is always 'live'. It is initially hidden and doesn't get made visible until the mapController activates it, which does the initial rendering of the map. After that the element is persisted with the map data already loaded, and any future access only updates the map with new locations/pins etc. Note that the default route is assigned to the mapController, which means that the mapController fires right as the page loads. That is actually a good thing in this case, as the map is the cornerstone of this app and is manipulated by some of the other controllers/views.

    The Controller handles some UI

    Since there's effectively no view activation with the template-less route, the controller unfortunately has to take over some UI interaction directly. Specifically, it has to swap the hidden state between the map and any of the other views. Here's what the controller looks like:

    myApp.controller('mapController', ["$scope", "$routeParams", "locationData",
        function($scope, $routeParams, locationData) {
            $scope.locationData = locationData.location;
            $scope.locationHistory = locationData.locationHistory;
            if ($routeParams.mode == "currentLocation") {
                bc.getCurrentLocation(false);
            }
            bc.showMap(false, "#LocationIcon");
        }]);

    bc.showMap is responsible for a couple of display tasks that hide/show the views/map and for activating/deactivating icons. The code looks like this:

    this.showMap = function (hide, selActiveIcon) {
        if (!hide)
            $("#MapContent").show();
        else {
            $("#MapContent").hide();
        }
        self.fitContent();
        if (selActiveIcon) {
            $(".iconbutton").removeClass("active");
            $(selActiveIcon).addClass("active");
        }
    };

    Each of the other controllers in the app also calls this function when it is activated, to basically hide the map and make the view content area visible; the map controller makes the map visible. This is UI code, and calling this sort of thing from controllers is generally not recommended, but I couldn't figure out a way using directives to make this work any more easily than this. It'd be easy to hide and show the map and view container using a flag and ng-show, but it gets tricky because of the scoping of the $scope. I would have to resort to storing this setting on the $rootScope, which I try to avoid. The same issue exists with the icons. It sure would be nice if Angular had a way to explicitly specify that a view shouldn't be destroyed when another view is activated, so currently this workaround is required.
    Searching around, I saw a number of wacky hacks to get around this, but the solution I'm using here seems much easier than anything I could dig up, even if it doesn't quite fit the 'Angular way'.

    Angular: nice, until it's not

    Overall I really like Angular and the way it works, although it took me a bit of time to get my head around how all the pieces fit together. Once I got the idea of how the app/routes, the controllers and the views snap together, putting together Angular pages becomes fairly straightforward. You can get quite a bit done never going beyond those basics. For most common things Angular's default routing and view presentation work very well. But when you do something a bit more complex, where there are multiple dependencies or, as in this case, where Angular doesn't appear to support a feature that's absolutely necessary, you're on your own. Finding information on more advanced topics is not trivial, especially since versions are changing so rapidly and the low-level behaviors change frequently, so finding something that works is often an exercise in trial and error. Not that this is surprising. Angular is a complex piece of kit, as are all the frameworks that try to hack JavaScript into submission to do something it was really never designed to do. After all, everything about a framework like Angular is an elaborate hack. A lot of shit has to happen to make this all work together, and at that, Angular (and Ember, Durandal, etc.) are pretty amazing pieces of JavaScript code. So no harm, no foul, but I just can't help feeling like I'm working in a toy sandbox at times :-)

    © Rick Strahl, West Wind Technologies, 2005-2013. Posted in Angular, JavaScript.

    Read the article

  • Migrating SQL Server Databases – The DBA’s Checklist (Part 2)

    - by Sadequl Hussain
    Continuing from Part 1  , our Migration Checklist continues: Step 5: Update statistics It is always a good idea to update the statistics of the database that you have just installed or migrated. To do this, run the following command against the target database: sp_updatestats The sp_updatestats system stored procedure runs the UPDATE STATISTICS command against every user and system table in the database.  However, a word of caution: running the sp_updatestats against a database with a compatibility level below 90 (SQL Server 2005) will reset the automatic UPDATE STATISTICS settings for every index and statistics of every table in the database. You may therefore want to change the compatibility mode before you run the command. Another thing you should remember to do is to ensure the new database has its AUTO_CREATE_STATISTICS and AUTO_UPDATE_STATISTICS properties set to ON. You can do so using the ALTER DATABASE command or from the SSMS. Step 6: Set database options You may have to change the state of a database after it has been restored. If the database was changed to single-user or read-only mode before backup, the restored copy will also retain these settings. This may not be an issue when you are manually restoring from Enterprise Manager or the Management Studio since you can change the properties. However, this is something to be mindful of if the restore process is invoked by an automated job or script and the database needs to be written to immediately after restore. You may want to check the database’s status programmatically in such cases. Another important option you may want to set for the newly restored / attached database is PAGE_VERIFY. This option specifies how you want SQL Server to ensure the physical integrity of the data. It is a new option from SQL Server 2005 and can have three values: CHECKSUM (default for SQL Server 2005 and latter databases), TORN_PAGE_DETECTION (default when restoring a pre-SQL Server 2005 database) or NONE. Torn page detection was itself an option for SQL Server 2000 databases. From SQL Server 2005, when PAGE_VERIFY is set to CHECKSUM, the database engine calculates the checksum for a page’s contents and writes it to the page header before storing it in disk. When the page is read from the disk, the checksum is computed again and compared with the checksum stored in the header.  Torn page detection works much like the same way in that it stores a bit in the page header for every 512 byte sector. When data is read from the page, the torn page bits stored in the header is compared with the respective sector contents. When PAGE_VERIFY is set to NONE, SQL Server does not perform any checking, even if torn page data or checksums are present in the page header.  This may not be something you would want to set unless there is a very specific reason.  Microsoft suggests using the CHECKSUM page verify option as this offers more protection. Step 7: Map database users to logins A common database migration issue is related to user access. Windows and SQL Server native logins that existed in the source instance and had access to the database may not be present in the destination. Even if the logins exist in the destination, the mapping between the user accounts and the logins will not be automatic. You can use a special system stored procedure called sp_change_users_login to address these situations. The procedure needs to be run against the newly attached or restored database and can accept four parameters. 
Depending on what you want to do, you may be using less than four though. The first parameter, @Action, can take three values. When you specify @Action = ‘Report’, the system will provide you with a list of database users which are not mapped to any login. If you want to map a database user to an existing SQL Server login, the value for @Action will be ‘Update_One’. In this case, you will only need to provide the database user name and the login it will map to. So if your newly restored database has a user account called “bob” and there is already a SQL Server login with the same name and you want to map the user to the login, you will execute a query like the following: sp_change_users_login         @Action = ‘Update_One’,         @UserNamePattern = ‘bob’,         @LoginName = ‘bob’ If the login does not exist, you can instruct SQL Server to create the login with the same name. In this case you will need to provide a password for the login and the value of the @Action parameter will be ‘Auto_Fix’. If the login already exists, it will be automatically mapped to the user account. Unfortunately sp_change_users_login system stored procedure cannot be used to map database users to trusted logins (Windows accounts) in SQL Server. You will need to follow a manual process to re-map the database user accounts.  Continues…
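
    Steps 5 and 6 above describe running sp_updatestats and setting database options with ALTER DATABASE, but the commands are scattered through the prose. Here is a minimal C#/ADO.NET sketch that applies them in one pass against a freshly migrated database. The database name MigratedDb and the connection string are placeholders, not values from the article.

    using System;
    using System.Data.SqlClient;

    class PostRestoreSetup
    {
        static void Main()
        {
            // Placeholder connection string and database name; adjust for your environment.
            var connectionString = "Server=.;Database=MigratedDb;Integrated Security=true";

            // The option changes and statistics update described in Steps 5 and 6.
            string[] commands =
            {
                "ALTER DATABASE MigratedDb SET AUTO_CREATE_STATISTICS ON",
                "ALTER DATABASE MigratedDb SET AUTO_UPDATE_STATISTICS ON",
                "ALTER DATABASE MigratedDb SET PAGE_VERIFY CHECKSUM",
                "EXEC sp_updatestats"
            };

            using (var connection = new SqlConnection(connectionString))
            {
                connection.Open();
                foreach (var sql in commands)
                {
                    using (var command = new SqlCommand(sql, connection))
                    {
                        command.CommandTimeout = 0; // sp_updatestats can take a while on large databases
                        command.ExecuteNonQuery();
                    }
                }
            }
        }
    }

    The same statements can of course be run from a query window in SSMS; the point is simply the order the checklist implies: set the options first, then refresh the statistics.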

    Read the article

  • SSH error: Permission denied, please try again

    - by Kamal
    I am new to ubuntu. Hence please forgive me if the question is too simple. I have a ubuntu server setup using amazon ec2 instance. I need to connect my desktop (which is also a ubuntu machine) to the ubuntu server using SSH. I have installed open-ssh in ubuntu server. I need all systems of my network to connect the ubuntu server using SSH (no need to connect through pem or pub keys). Hence opened SSH port 22 for my static IP in security groups (AWS). My SSHD-CONFIG file is: # Package generated configuration file # See the sshd_config(5) manpage for details # What ports, IPs and protocols we listen for Port 22 # Use these options to restrict which interfaces/protocols sshd will bind to #ListenAddress :: #ListenAddress 0.0.0.0 Protocol 2 # HostKeys for protocol version 2 HostKey /etc/ssh/ssh_host_rsa_key HostKey /etc/ssh/ssh_host_dsa_key HostKey /etc/ssh/ssh_host_ecdsa_key #Privilege Separation is turned on for security UsePrivilegeSeparation yes # Lifetime and size of ephemeral version 1 server key KeyRegenerationInterval 3600 ServerKeyBits 768 # Logging SyslogFacility AUTH LogLevel INFO # Authentication: LoginGraceTime 120 PermitRootLogin yes StrictModes yes RSAAuthentication yes PubkeyAuthentication yes #AuthorizedKeysFile %h/.ssh/authorized_keys # Don't read the user's ~/.rhosts and ~/.shosts files IgnoreRhosts yes # For this to work you will also need host keys in /etc/ssh_known_hosts RhostsRSAAuthentication no # similar for protocol version 2 HostbasedAuthentication no # Uncomment if you don't trust ~/.ssh/known_hosts for RhostsRSAAuthentication #IgnoreUserKnownHosts yes # To enable empty passwords, change to yes (NOT RECOMMENDED) PermitEmptyPasswords no # Change to yes to enable challenge-response passwords (beware issues with # some PAM modules and threads) ChallengeResponseAuthentication no # Change to no to disable tunnelled clear text passwords #PasswordAuthentication yes # Kerberos options #KerberosAuthentication no #KerberosGetAFSToken no #KerberosOrLocalPasswd yes #KerberosTicketCleanup yes # GSSAPI options #GSSAPIAuthentication no #GSSAPICleanupCredentials yes X11Forwarding yes X11DisplayOffset 10 PrintMotd no PrintLastLog yes TCPKeepAlive yes #UseLogin no #MaxStartups 10:30:60 #Banner /etc/issue.net # Allow client to pass locale environment variables AcceptEnv LANG LC_* Subsystem sftp /usr/lib/openssh/sftp-server # Set this to 'yes' to enable PAM authentication, account processing, # and session processing. If this is enabled, PAM authentication will # be allowed through the ChallengeResponseAuthentication and # PasswordAuthentication. Depending on your PAM configuration, # PAM authentication via ChallengeResponseAuthentication may bypass # the setting of "PermitRootLogin without-password". # If you just want the PAM account and session checks to run without # PAM authentication, then enable this but set PasswordAuthentication # and ChallengeResponseAuthentication to 'no'. UsePAM yes Through webmin (Command shell), I have created a new user named 'senthil' and added this new user to 'sudo' group. sudo adduser -y senthil sudo adduser senthil sudo I tried to login using this new user 'senthil' in 'webmin'. I was able to login successfully. When I tried to connect ubuntu server from my terminal through SSH, ssh senthil@SERVER_IP It asked me to enter password. After the password entry, it displayed: Permission denied, please try again. On some research I realized that, I need to monitor my server's auth log for this. 
I got the following error in my auth log (/var/log/auth.log) Jul 2 09:38:07 ip-192-xx-xx-xxx sshd[3037]: pam_unix(sshd:auth): authentication failure; logname= uid=0 euid=0 tty=ssh ruser= rhost=MY_CLIENT_IP user=senthil Jul 2 09:38:09 ip-192-xx-xx-xxx sshd[3037]: Failed password for senthil from MY_CLIENT_IP port 39116 ssh2 When I tried to debug using: ssh -v senthil@SERVER_IP OpenSSH_5.9p1 Debian-5ubuntu1, OpenSSL 1.0.1 14 Mar 2012 debug1: Reading configuration data /etc/ssh/ssh_config debug1: /etc/ssh/ssh_config line 19: Applying options for * debug1: Connecting to SERVER_IP [SERVER_IP] port 22. debug1: Connection established. debug1: identity file {MY-WORKSPACE}/.ssh/id_rsa type 1 debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048 debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048 debug1: identity file {MY-WORKSPACE}/.ssh/id_rsa-cert type -1 debug1: identity file {MY-WORKSPACE}/.ssh/id_dsa type -1 debug1: identity file {MY-WORKSPACE}/.ssh/id_dsa-cert type -1 debug1: identity file {MY-WORKSPACE}/.ssh/id_ecdsa type -1 debug1: identity file {MY-WORKSPACE}/.ssh/id_ecdsa-cert type -1 debug1: Remote protocol version 2.0, remote software version OpenSSH_5.8p1 Debian-7ubuntu1 debug1: match: OpenSSH_5.8p1 Debian-7ubuntu1 pat OpenSSH* debug1: Enabling compatibility mode for protocol 2.0 debug1: Local version string SSH-2.0-OpenSSH_5.9p1 Debian-5ubuntu1 debug1: SSH2_MSG_KEXINIT sent debug1: SSH2_MSG_KEXINIT received debug1: kex: server->client aes128-ctr hmac-md5 none debug1: kex: client->server aes128-ctr hmac-md5 none debug1: sending SSH2_MSG_KEX_ECDH_INIT debug1: expecting SSH2_MSG_KEX_ECDH_REPLY debug1: Server host key: ECDSA {SERVER_HOST_KEY} debug1: Host 'SERVER_IP' is known and matches the ECDSA host key. debug1: Found key in {MY-WORKSPACE}/.ssh/known_hosts:1 debug1: ssh_ecdsa_verify: signature correct debug1: SSH2_MSG_NEWKEYS sent debug1: expecting SSH2_MSG_NEWKEYS debug1: SSH2_MSG_NEWKEYS received debug1: Roaming not allowed by server debug1: SSH2_MSG_SERVICE_REQUEST sent debug1: SSH2_MSG_SERVICE_ACCEPT received debug1: Authentications that can continue: password debug1: Next authentication method: password senthil@SERVER_IP's password: debug1: Authentications that can continue: password Permission denied, please try again. senthil@SERVER_IP's password: For password, I have entered the same value which I normally use for 'ubuntu' user. Can anyone please guide me where the issue is and suggest some solution for this issue?

    Read the article

  • SSH service will not start on fresh Cygwin 1.7.15 install

    - by Coder6841
    OS: Windows 7 x64 Cygwin: 1.7.15-1 OpenSSH: 6.0p1-1 I'm attempting to install an SSH server on Windows 7. The tutorial that I'm following to do this is here: http://www.howtogeek.com/howto/41560/how-to-get-ssh-command-line-access-to-windows-7-using-cygwin/ The issue is that upon executing the net start sshd command I get the following output:The CYGWIN sshd service is starting. The CYGWIN sshd service could not be started. The service did not report an error. More help is available by typing NET HELPMSG 3534. Here is the full output of the setup: AdminUser@ThisComputer ~ $ ssh-host-config *** Info: Generating /etc/ssh_host_key *** Info: Generating /etc/ssh_host_rsa_key *** Info: Generating /etc/ssh_host_dsa_key *** Info: Generating /etc/ssh_host_ecdsa_key *** Info: Creating default /etc/ssh_config file *** Info: Creating default /etc/sshd_config file *** Info: Privilege separation is set to yes by default since OpenSSH 3.3. *** Info: However, this requires a non-privileged account called 'sshd'. *** Info: For more info on privilege separation read /usr/share/doc/openssh/README.privsep. *** Query: Should privilege separation be used? (yes/no) yes *** Info: Note that creating a new user requires that the current account have *** Info: Administrator privileges. Should this script attempt to create a *** Query: new local account 'sshd'? (yes/no) yes *** Info: Updating /etc/sshd_config file *** Query: Do you want to install sshd as a service? *** Query: (Say "no" if it is already installed as a service) (yes/no) yes *** Query: Enter the value of CYGWIN for the daemon: [] *** Info: On Windows Server 2003, Windows Vista, and above, the *** Info: SYSTEM account cannot setuid to other users -- a capability *** Info: sshd requires. You need to have or to create a privileged *** Info: account. This script will help you do so. *** Info: You appear to be running Windows XP 64bit, Windows 2003 Server, *** Info: or later. On these systems, it's not possible to use the LocalSystem *** Info: account for services that can change the user id without an *** Info: explicit password (such as passwordless logins [e.g. public key *** Info: authentication] via sshd). *** Info: If you want to enable that functionality, it's required to create *** Info: a new account with special privileges (unless a similar account *** Info: already exists). This account is then used to run these special *** Info: servers. *** Info: Note that creating a new user requires that the current account *** Info: have Administrator privileges itself. *** Info: No privileged account could be found. *** Info: This script plans to use 'cyg_server'. *** Info: 'cyg_server' will only be used by registered services. *** Query: Do you want to use a different name? (yes/no) no *** Query: Create new privileged user account 'cyg_server'? (yes/no) yes *** Info: Please enter a password for new user cyg_server. Please be sure *** Info: that this password matches the password rules given on your system. *** Info: Entering no password will exit the configuration. *** Query: Please enter the password: *** Query: Reenter: *** Info: User 'cyg_server' has been created with password '[CENSORED]'. *** Info: If you change the password, please remember also to change the *** Info: password for the installed services which use (or will soon use) *** Info: the 'cyg_server' account. *** Info: Also keep in mind that the user 'cyg_server' needs read permissions *** Info: on all users' relevant files for the services running as 'cyg_server'. 
*** Info: In particular, for the sshd server all users' .ssh/authorized_keys *** Info: files must have appropriate permissions to allow public key *** Info: authentication. (Re-)running ssh-user-config for each user will set *** Info: these permissions correctly. [Similar restrictions apply, for *** Info: instance, for .rhosts files if the rshd server is running, etc]. *** Info: The sshd service has been installed under the 'cyg_server' *** Info: account. To start the service now, call `net start sshd' or *** Info: `cygrunsrv -S sshd'. Otherwise, it will start automatically *** Info: after the next reboot. *** Info: Host configuration finished. Have fun! AdminUser@ThisComputer ~ $ net start sshd The CYGWIN sshd service is starting. The CYGWIN sshd service could not be started. The service did not report an error. More help is available by typing NET HELPMSG 3534. Note that on the line *** Query: Enter the value of CYGWIN for the daemon: [] I haven't entered anything. Tutorials often say to use ntsec or ntsec tty here but those options are removed from the latest version of OpenSSH. I've tried using them anyway and the result is the same. The file /var/log/sshd.log is empty. If I try just running the command /usr/sbin/sshd I get the output /var/empty must be owned by root and not group or world-writable.. The /var/empty directory has the following permissions: drwxr-xr-x+ 1 cyg_server root 0 May 29 15:28 empty. Google searches on this error did not turn up any working fixes. One person seems to have solved it by using the command chown SYSTEM /var/empty but that did not fix it in my case.

    Read the article

  • MySQL Connector/Net 6.5.5 Maintenance Release has been released

    - by fernando
    MySQL Connector/Net 6.5.5, a new maintenance release of our 6.5 series, has been released.  This release is GA quality and is appropriate for use in production environments.  Please note that 6.6 is our latest driver series and is the recommended product for development. It is now available in source and binary form from http://dev.mysql.com/downloads/connector/net/#downloads and mirror sites (note that not all mirror sites may be up to date at this point-if you can't find this version on some mirror, please try again later or choose another download site.) The 6.5.5 version of MySQL Connector/Net brings the following fixes: - Fix for ArgumentNull exception when using Take().Count() in a LINQ to Entities query (bug MySql #64749, Oracle bug #13913047). - Fix for type varchar changed to bit when saving in Table Designer (Oracle bug #13916560). - Fix for error when trying to change the name of an Index on the Indexes/Keys editor; along with this fix now users can change the Index type of a new Index which could not be done   in previous versions, and when changing the Index name the change is reflected on the list view at the left side of the Index/Keys editor (Oracle bug #13613801). - Fix for stored procedure call using only its name with EF code first (MySql bug #64999, Oracle bug #14008699). - Fix for List.Contains generates a bunch of ORs instead of more efficient IN clause in   LINQ to Entities (Oracle bug #14016344, MySql bug #64934). - Fix for performance issue in generated EF query: .NET StartsWith/Contains/EndsWith produces MySql's locate instead of Like (MySql bug #64935, Oracle bug #14009363). - Fix for script generated for code first contains wrong alter table and wrong declaration for byte[] (MySql bug #64216, Oracle bug #13900091). - Fix and code contribution for bug Timed out sessions are removed without notification which allow to enable the Expired CallBack when Session Provider times out any session (bug MySql #62266 Oracle bug # 13354935) - Fix for Exception thrown when using cascade delete in an EDM Model-First in Entity Framework (Oracle bug #14008752, MySql bug #64779). - Fix for Session locking issue with MySqlSessionStateStore (MySql bug #63997, Oracble bug #13733054). - Fixed deleting a user profile using Profile provider (MySQL bug #64470, Oracle bug #13790123) - Fix for bug Cannot Create an Entity with a Key of Type String (MySQL bug #65289, Oracle bug #14540202). This fix checks if the type has a FixedLength facet set in order to create a char otherwise should create varchar, mediumtext or longtext types when using a String CLR type in Code First or Model First also tested in Database First. Unit tests added for Code First and ProviderManifest. - Fix for bug "CacheServerProperties can cause 'Packet too large' error". The issue was due to a missing reading of Max_allowed_packet server property when CacheServerProperties is in true, since the value was read only in the first connection but the following pooled connections had a wrong value causing a Packet too large error. Including also a unit test for this scenario. All unit test passed. MySQL Bug #66578 Orabug #14593547. - Fix for handling unnamed parameter in MySQLCommand. This fix allows the mysqlcommand to handle parameters without requiring naming (e.g. INSERT INTO Test (id,name) VALUES (?, ?) ) (MySQL Bug #66060, Oracle bug #14499549). - Fixed inheritance on Entity Framework Code First scenarios. 
Discriminator column is created using its correct type as varchar(128) (MySql bug #63920 and Oracle bug #13582335). - Fixed "Trying to customize column precision in Code First does not work" (MySql bug #65001, Oracle bug #14469048). - Fixed bug ASP.NET Membership database fails on MySql database UTF32 (MySQL bug #65144, Oracle bug #14495292). - Fix for MySqlCommand.LastInsertedId holding only 32 bit values (MySql bug #65452, Oracle bug #14171960) by changing   several internal declaration of lastinsertid from int to long. - Fixed "Decimal type should have digits at right of decimal point", now default is 2, but user's changes in   EDM designer are recognized (MySql bug #65127, Oracle bug #14474342). - Fix for NullReferenceException when saving an uninitialized row in Entity Framework (MySql bug #66066, Oracle bug #14479715). - Fix for error when calling RoleProvider.RemoveUserFromRole(): causes an exception due to a wrong table being used (MySql bug #65805, Oracle bug #14405338). - Fix for "Memory Leak on MySql.Data.MySqlClient.MySqlCommand", too many MemoryStream's instances created (MySql bug #65696, Oracle bug #14468204). - Added ANTLR attribution notice (Oracle bug #14379162). - Fixed Entity Framework + mysql connector/net in partial trust throws exceptions (MySql bug #65036, Oracle bug #14668820). - Added support in Parser for Datetime and Time types with precision when using Server 5.6 (No bug Number). - Small improvement on MySqlPoolManager CleanIdleConnections for better mysqlpoolmanager idlecleanuptimer at startup (MySql bug #66472 and Oracle bug #14652624). - Fix for bug TIMESTAMP values are mistakenly represented as DateTime with Kind = Local (Mysql bug #66964, Oracle bug #14740705). - Fix for bug Keyword not supported. Parameter name: AttachDbFilename (Mysql bug #66880, Oracle bug #14733472). - Added support to MySql script file to retrieve data when using "SHOW" statements. - Fix for Package Load Failure in Visual Studio 2005 (MySql bug #63073, Oracle bug #13491674). - Fix for bug "Unable to connect using IPv6 connections" (MySQL bug #67253, Oracle bug #14835718). - Added auto-generated values for Guid identity columns (MySql bug #67450, Oracle bug #15834176). - Fix for method FirstOrDefault not supported in some LINQ to Entities queries (MySql bug #67377, Oracle bug #15856964). The release is available to download at http://dev.mysql.com/downloads/connector/net/6.5.html Documentation ------------------------------------- You can view current Connector/Net documentation at http://dev.mysql.com/doc/refman/5.5/en/connector-net.html You can find our team blog at http://blogs.oracle.com/MySQLOnWindows. You can also post questions on our forums at http://forums.mysql.com/. Enjoy and thanks for the support! 
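
    Two of the fixes listed above, the handling of unnamed ('?') parameters and MySqlCommand.LastInsertedId being widened from int to long, are easiest to see in a small client snippet. Here is a minimal sketch using MySql.Data.MySqlClient; the connection string and the Test table are placeholders rather than anything taken from the release notes.

    using System;
    using MySql.Data.MySqlClient;

    class ConnectorNetSample
    {
        static void Main()
        {
            // Placeholder connection string and table; adjust for your environment.
            var connectionString = "server=localhost;database=test;uid=user;pwd=password";

            using (var connection = new MySqlConnection(connectionString))
            {
                connection.Open();

                // Named parameters shown here; the 6.5.5 fix also accepts unnamed '?'
                // placeholders, e.g. "INSERT INTO Test (id, name) VALUES (?, ?)".
                using (var command = new MySqlCommand(
                    "INSERT INTO Test (name) VALUES (@name)", connection))
                {
                    command.Parameters.AddWithValue("@name", "example");
                    command.ExecuteNonQuery();

                    // LastInsertedId is a long, so auto-increment values beyond 32 bits
                    // are returned correctly after this release.
                    long newId = command.LastInsertedId;
                    Console.WriteLine("Inserted row id: " + newId);
                }
            }
        }
    }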

    Read the article

  • CodePlex Daily Summary for Sunday, December 26, 2010

    CodePlex Daily Summary for Sunday, December 26, 2010Popular ReleasesNoSimplerAccounting: NoSimplerAccounting 6.0: -Fixed a bug in expense category report.Temporary Data Storage Folder: TDS Folder source code: This is latest version 0.2 Beta code. To know the password request at neutron.request@gmail.comNHibernate Mapping Generator: NHibernate Mapping Generator 2.0: Added support for Postgres (Thanks to Angelo)NewLife XCode: XCode v6.5.2010.1223 ????(????v3.5??): XCode v6.5.2010.1223 ????,??: NewLife.Core ??? NewLife.Net ??? XControl ??? XTemplate ????,??C#?????? XAgent ???? NewLife.CommonEnitty ??????(???,XCode??????) XCode?? ?????????,??????????????????,?????95% XCode v3.5.2009.0714 ??,?v3.5?v6.0???????????????,?????????。v3.5???????????,??????????????。 XCoder ??XTemplate?????????,????????XCode??? XCoder_Src ???????(????XTemplate????),??????????????????Windows Workflow Foundation on Codeplex: Microsoft.Activities.UnitTesting 1.71: New in this release Episode as Tasks using the Task Parallel Library CHM Help file now included in release. StyleCop compliance is resulting in massive refatoring but should not cause breaking changes Breaking Change DefaultTimeout is now a TimeSpan Removed default parameters using int timeout = DefaultTimeout and added overloads insteadMiniTwitter: 1.64: MiniTwitter 1.64 ???? ?? 1.63 ??? URL ??????????????Ajax ASP.Net Forum: InSeCla Forum Software v0.1.9: *VERSION: 0.1.9* HAPPY CHRISTMAS FEATURES ADDED Added features customizabled per category level (Customize at ADMIN/Categories Tab) Allow Anonymous Threads, Allow Anonymous Post Virtual URLs (friendly urls) has finally added And you can have some forum (category) using virtual urls and other using normal urls. Check !, as this improve the SEO indexing results Moderation Instant On: Delete Thread, Move Thread Available to users being members of moderators or administrators InstantO...VivoSocial: VivoSocial 7.4.0: Please see changes: http://support.vivoware.com/project/ChangeLog.aspx?PROJID=48Umbraco CMS: Umbraco 4.6 Beta - codename JUNO: The Umbraco 4.6 beta (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and contains more than 89 bug fixes since the 4.5.2 release. Improved installer experience Updated Starter Kits (Simple, Blog, Personal, Business) Beautiful, free, customizable skins included Skinning engine and Skin customization (see Skinning Documentation Kit) Default dashboards on install with hide option Updated Login t...SSH.NET Library: 2010.12.23: This release includes some bug fixes and few new fetures. Fixes Allow to retrieve big directory structures ssh-dss algorithm is fixed Populate sftp file attributes New Features Support for passhrase when private key is used Support added for diffie-hellman-group14-sha1,diffie-hellman-group-exchange-sha256 and diffie-hellman-group-exchange-sha1 key exchange algorithms Allow to provide multiple key files for authentication Add support for "keyboard-interactive" authentication method...ASP.NET MVC SiteMap provider: MvcSiteMapProvider 2.3.0: Using NuGet?MvcSiteMapProvider is also listed in the NuGet feed. Learn more... Like the project? Consider a donation!Donate via PayPal via PayPal. Release notesThis will be the last release targeting ASP.NET MVC 2 and .NET 3.5. 
MvcSiteMapProvider 3.0.0 will be targeting ASP.NET MVC 3 and .NET 4 Web.config setting skipAssemblyScanOn has been deprecated in favor of excludeAssembliesForScan and includeAssembliesForScan ISiteMapNodeUrlResolver is now completely responsible for generating th...SuperSocket, an extensible socket application framework: SuperSocket 1.3 beta 2: Compared with SuperSocket 1.3 beta 1, the changes listed below have been done in SuperSocket 1.3 beta 2: added supports for .NET 3.5 replaced Logging Application Block of EntLib with Log4Net improved the code about logging fixed a bug in QuickStart sample project added IPv6 supportTibiaPinger: TibiaPinger v1.0: TibiaPinger v1.0Media Companion: Media Companion 3.400: Extract the entire archive to a folder which has user access rights, eg desktop, documents etc. A manual is included to get you startedMulticore Task Framework: MTF 1.0.1: Release 1.0.1 of Multicore Task Framework.SQL Monitor - tracking sql server activities: SQL Monitor 3.0 alpha 7: 1. added script save/load in user query window 2. fixed problem with connection dialog when choosing windows auth but still ask for user name 3. auto open user table when double click one table node 4. improved alert message, added log only methodEnhSim: EnhSim 2.2.6 ALPHA: 2.2.6 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Fixing up some r...ASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.4.3: Helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: Improvements for confirm, popup, popup form RenderView controller extension the user experience for crud in live demo has been substantially improved + added search all the features are shown in the live demoGanttPlanner: GanttPlanner V1.0: GanttPlanner V1.0 include GanttPlanner.dll and also a Demo application.N2 CMS: 2.1 release candidate 3: * Web platform installer support available N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) A bunch of bugs were fixed File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? Try the demo site Download one of the templ...New ProjectsA.I. Semantic Net Tests: A test using "semantic net" artificial intelligence developed in C++.Awesome.ClientID: Awesome.ClientID is a HTTPModule which serializes the ClientID of server controls, and page/usercontrol properties to a Json to make working with JavaScript easier without the need for outputting ClientID's. It's a solution for clean ClientIDs for .NET 2.0/3.5BitTweet: BitTweet is a fast, simple and easy to use Twitter client with tweeting, reading and writing functions included. It is know for being the most secure and fast Twitter client out there! It is written in Visual Basic .NET 2008. 
~Tweet in a bit!CarRental001: carCrudo: CRUDO - The MCG (Model-Controller-Generator) CGF (Code Generation Framework) Visit The Project HomePage: http://adityayadav.com/CRUDO_The_MCG_Model_Controller_Generator_CGF_Code_Generation_Framework.aspx Licenses: 1) GPL v2 2) Commercial (contact us for information)Effeks: FX trading sample application sing prismEnhanced WebPart: Enhanced Webpart is a webpart with enhanced properties editor. It allows you display properties of most common types, and provides easy way to add custom controls to display your own types in webpart properties editor. Enhanced Webpart fully supports localization.Esteffin's graphic: Esercitazioni grafica 2010farasun: Source code Repository for farasun.wordpress.comGCTF: Desafio .NET Realizado na FACISAImgshow Plugin for Windows Live Writer: By using Imgshow Query you are able to publish multimedia content such as Youtube, Facebook Video, PDF, and other Imgshow supported service.Instant Messenger Using WPF and WCF: instant messenger using wcf and wpfjMenu: Create a dotnetnuke module that cover all kind of jquery menu pluginsMcopy API: Copy all files and directores in windows shell. Support long path (less then 32000 chars) and network path (eg. \\server\share or \\127.0.0.1\share)MOSSWebServices: This project contains code for interacting with the 21 web services provided by MOSS. This is targeted towards WSS developers This currently contains code to interact with "UserGroups" web service. I will keep adding code to interact with other MOSS web services soon. Nest Hub - Twitter Client: Nest Hub is a free Twitter client for PC. It's developed in VB.NET.Neural Networks Library: Neural networks Library by SefnajParallelism: Parallelism- Unified Parallel & Concurrency Framework Visit the Project HomePage: http://adityayadav.com/Parallelism.aspx Licenses: 1) GPL v2 2) Commercial (contact us for information)PDF IRM Protector For SharePoint 2007 And 2010: The PDF IRM Protector controls the conversion of PDF documents to their encrypted, rights-managed format and the decryption of PDF documents from their rights-managed format back to their original format. The project targets both SharePoint 2007 and 2010. PHP Form Validator - Isis: Project Isis simplifies the process of form validation for PHP. Offering a robust flag and function system, allowing you to encode the forms with the correct validation data and be done.Recycle Me: Open Source Project is to help and educate people in regarding reuse and recycling of daily life items. This is a Social project so we need funds for research and surveys. Volunteers are most welcome to contribute and certifications for their participation are available. ThanksSencha ExtJS Import Library for Script#: Script# import library for Sencha Ext JS Javascript Framework Separating Vertices: Given a graph as a collection of edges and vertices, it identifies the separating vertices, biconnected components, and edges. SIG - SISTEMA DE GESTÃO INTERNA: Projeto feito para a empresa MODEMTELECOM, consultoria em telecomunicoções.Smart Card COM: Smart Card COM Library, useful to access smart cards from a Web page in Internet ExplorerSocial Hub: Social HubStock Research: Test project for stocke researchtaokeba: ????????????????????,???????????。Temporary Data Storage Folder: This software creates a folder that stores temporary data that you want to store for temporary tasks. Temporary tasks such as storing a file that is to be deleted after some time. It automatically deleted those files on exit. 
Learn more at www.neutronsoftwares.weebly.comTukUI Updater: A OpenSource Windows Updater tool for the World of Warcraft(R) addon TukUI Developed in .NET 3.5 Discusion thread are found at the official TukUI forums: http://www.tukui.org/v2/forums/topic.php?id=4982 WebUtil: This is a project for using all of features by the web.

    Read the article

  • Does HTML 5 Make “Rich vs. Reach” a False Choice?

    - by andrewbrust
    The competition between the Web and proprietary rich platforms, including Windows, Mac OS, iPhone/iPad, Adobe’s Flash/AIR and Microsoft’s Silverlight, is not new. But with the emergence of HTML 5 and imminent support for it in the next release of the major Web browsers, the battle is heating up. And with the announcements made Wednesday at Google's I/O conference, it's getting kicked up yet another notch. The impact of this platform battle on companies in the media and advertising world, and the developers who serve them, is significant. The most prominent question is whether video and rich media online will shift towards pure HTML and away from plug-ins like Flash and Silverlight. In fact, certain features in HTML 5 make it suitable for development for line of business applications as well, further threatening those plug-in technologies. So what's the deal? Is this real or hype? To answer that question, I've done my own research into HTML 5's features and talked to several media-focused, New York area developers to get their opinions. I present my findings to you in this post. Before bearing down into HTML 5 specifics and practitioners’ quotes, let's set the context. To understand what HTML 5 can do, take a look at this video of Sports Illustrated’s HTML 5 prototype. This should start to get you bought into the idea that HTML 5 could be a game-changer. Next, if you happen to have installed the beta version of Google's Chrome 5 browser, take a look at the page linked to below, and in that page, click on any of the game thumbnails to see what's possible, without a plug-in, in this brave new world. (Note, although the instructions for each game tell you to press the A key to start, press the Z key instead.). Here's the link: http://www.kesiev.com/akihabara As an adjunct to what's enabled by HTML 5, consider the various transforms that are part of CSS 3. If you're running Safari as your browser, the following link will showcase this live; if not, you'll see a bitmap that will give you an idea of what's possible: http://webkit.org/blog/386/3d-transforms Are you starting to get the picture (literally)? What has up until now required browser plug-ins and other patches to HTML, most typically Flash, will soon be renderable, natively, in all major browsers. Moreover, it's looking likely that developers will be able to deliver such content and experiences in these browsers using one base of markup and script code (using straight JavaScript and/or jQuery), without resorting to browser-specific code and workarounds. If you're skeptical of this, I wouldn't blame you, especially with respect to Microsoft's Internet Explorer. However, i can tell you with confidence that even Microsoft is dedicated to full-on HTML 5 support in version 9 of that browser, which is currently under development. So what’s new in HTML 5, specifically, that makes sites like this possible?  The specification documents go into deep detail, and there’s no sense in rehashing them here, but a summary is probably in order.   Here is a non-authoritative, but useful, list of the major new feature areas in HTML 5: 2D drawing capabilities and 3D transforms. 2D drawing instructions can be embedded statically into a Web page; application interactivity and animation can be achieved through script.  As mentioned above, 3D transforms are technically part of version 3 of the CSS (Cascading Style Sheets) spec, rather than HTML 5, but they can nonetheless be thought of as part of the bundle.  
They allow for rendering of 3D images and animations that, together with 2D drawing, make HTML-based games much more feasible than they are presently, as the links above demonstrate. Embedded audio and video. A media player can appear directly in a rendered Web page, using HTML markup and no plug-ins. Alternately, player controls can be hidden and the content can play automatically. Major enhancements to form-based input. This includes such things as specification of required fields, embedding of text “hints” into a control, limiting valid input on a field to dates, email addresses or a list of values.  There’s more to this, but the gist is that line-of-business applications, with complicated input and data validation, are supported directly Offline caching, local storage and client-side SQL database. These facilities allow Web applications to function more like native apps, even if no internet connection is available. User-defined data. Data (or metadata – data about data) can easily be embedded statically and/or retrieved and updated with Javascript code. This avoids having to embed that data in a separate file, or within script code. Taken together, these features position HTML to compete with, and perhaps overtake, Adobe’s Flash/AIR (and Microsoft’s Silverlight) as a viable Web platform for media, RIAs (rich internet applications – apps that function more like desktop software than Web sites) and interactive Web content, including games. What do players in the media world think about this?  From the embedded video above, we know what Sports Illustrated (and, therefore, Time Warner) think.  Hulu, the major Internet site for broadcast TV content, is on record as saying HTML5 video does not pass muster with them, at least not yet.  YouTube, on the other hand, already has an experimental HTML 5-based version of their site.  TechCrunch has reported that NetFlix is flirting with HTML 5 too, especially as it pertains to embedded browsers in TV-based devices.  And the New York Times’ Web site now embeds some video clips without resorting to Flash.  They have to – otherwise iPhone, iPod Touch and iPad users couldn’t see them in the Mobile Safari browser. What do media-focused developers think about all this?  I talked to several to get their opinions. Michael Pinto is CEO and Founder of Very Memorable Design whose primary focus has been to help marketing directors get traction online.  The firm’s client roster includes the likes Time, Inc., Scholastic and PBS.  Pinto predicts that “More and more microsites that were done entirely in Flash will be done more and more using jQuery. I can also see slideshows and video now being done without Flash. However if you needed to create a game or highly interactive activity Flash would still be the way to go for the web.” A dissenting view comes from Jesse Erlbaum, CEO of The Erlbaum Group, LLC, which serves numerous clients in the magazine publishing sector.  When I asked Erlbaum whether he thought HTML 5 and jQuery/JavaScript would steal significant market share from Flash, he responded “Not at all!  In particular, not for media and advertising customers!  These sectors are not generally in the business of making highly functional applications, which is the one place where HTML5/jQuery/etc really shines.” Ironically, Pinto’s firm is a heavy user of Flash for its projects and Erlbaum’s develops atop the “LAMP” (Linux, Apache, MySQL and PHP/Perl) stack.  For whatever reason, each firm seems to see the other’s toolset as a more viable choice.  
But both agree that the developer tool story around HTML 5 is deficient. Pinto explains “What’s lost with [HTML 5 and Javascript] techniques is that there isn’t a single widely favored easy-to-use tool of choice for authoring. So with Flash you can get up and running right away and not worry about what is different from one browser to the next.” Erlbaum agrees, saying: “HTML5/Javascript lacks a sophisticated integrated development environment (IDE) which is an essential part of Flash. If what someone is trying to make is primarily animation, it's a waste of time…to do this in Javascript. It can be done much more easily in Flash, and with greater cross-browser compatibility and consistency due to the ubiquity of Flash.” Adobe (maker of Flash since its 2005 acquisition of Macromedia) likely agrees. And for better or worse, they’ve decided to address this shortcoming of HTML 5, even at risk of diminishing their Flash platform. Yesterday Adobe announced that their hugely popular Dreamweaver Web design authoring tool would directly support HTML 5 and CSS 3 development. In fact, the Adobe Dreamweaver CS5 HTML5 Pack is downloadable now from Adobe Labs. Maybe Adobe is bowing to pressure from ardent Web professionals like Scott Kellum, Lead Designer at Channel V Media, a digital and offline branding firm serving the media and marketing sectors, among others. Kellum told me that HTML 5 “…will definitely move people away from Flash. It has many of the same functionalities with faster load times and better accessibility. HTML5 will help Flash as well: with the new caching methods you can now even run Flash apps offline.” Although all three Web developers I interviewed would agree that Flash is still required for more sophisticated applications, Kellum seems to have put his finger on why HTML 5 may nonetheless dominate. In his view, much of the Web development out there has little need for high-end capabilities: “Most people want to add a little punch to a navigation bar or some video and now you can get the biggest bang for your buck with HTML5, CSS3 and Javascript.” I’ve already mentioned that Google’s ongoing I/O conference, at the Moscone West center in San Francisco, is driving the HTML 5 news cycle, big time. And Google made many announcements of their own, including the open sourcing of their VP8 video codec, new enterprise-oriented capabilities for its App Engine cloud offering, and the creation of the Chrome Web Store, which the company says will make it easier to find and “install” Web applications, in a fashion similar to the way users procure native apps on various mobile platforms. HTML 5 looks to be disruptive, especially to the media world. And even if the technology ends up disappointing, the chatter around it alone is causing big changes in the technology world. If the richness it promises delivers, then magazine publishers and non-text digital advertisers may indeed have a platform for creating compelling content that loads quickly, is standards-based and will render identically in (the newest versions of) all major Web browsers. Can this development in the digital arena save the titans of the print world? I can’t predict, but it’s going to be fun to watch, and the competitive innovation from all players in both industries will likely be immense.
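To make the feature list above a little more concrete, here is a small, hypothetical HTML 5 fragment that touches several of the items discussed: native video, 2D drawing on a canvas from script, the richer form inputs, local storage and user-defined data attributes. The element names, IDs and file names are invented for illustration, so treat this as a sketch of the markup style rather than anything production-ready.

<video src="highlights.mp4" controls width="480"></video>

<canvas id="scoreboard" width="300" height="100"></canvas>

<form>
  <input type="email" name="subscriber" required placeholder="you@example.com">
  <input type="date" name="gameday">
  <button>Sign up</button>
</form>

<ul>
  <li data-team-id="42">New York</li>
</ul>

<script>
  // 2D drawing through the canvas API
  var ctx = document.getElementById("scoreboard").getContext("2d");
  ctx.fillStyle = "#336699";
  ctx.fillRect(10, 10, 120, 40);
  ctx.fillStyle = "#ffffff";
  ctx.fillText("HOME 3 - 1 AWAY", 20, 35);

  // local storage survives reloads, which is what makes offline-capable Web apps plausible
  localStorage.setItem("lastScore", "3-1");

  // user-defined data-* attributes are plain markup but fully readable from script
  var teamId = document.querySelector("li[data-team-id]").getAttribute("data-team-id");
</script>

No plug-in is involved anywhere in that fragment, which is precisely the point the article is making.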

    Read the article

  • What steps can you take to ensure sane build environments when compiling software?

    - by Chris Adams
    Hi guys, I've been stuck with a compilation problem when building a standardised virtual machine on CentOS 5.4, and I'm in the dark here as to a) why this error is occurring, and b) how to fix it, and in the hope that someone else stumbles across this problem too, I'm hoping someone can help me find the solution here. I'm getting a configure: error: newly created file is older than distributed files! error when trying to compile Ruby Enterprise like below when I try to run the installer, and the solutions offered to on the forums (of checking the tine, and touching the files to update the time associated with them) don't seem to be helping here. What steps can I take to work out what the cause of this problem? [vagrant@vagrant-centos-5 ruby-enterprise-1.8.7-2009.10]$ sudo ./installer Welcome to the Ruby Enterprise Edition installer This installer will help you install Ruby Enterprise Edition 1.8.7-2009.10. Don't worry, none of your system files will be touched if you don't want them to, so there is no risk that things will screw up. You can expect this from the installation process: 1. Ruby Enterprise Edition will be compiled and optimized for speed for this system. 2. Ruby on Rails will be installed for Ruby Enterprise Edition. 3. You will learn how to tell Phusion Passenger to use Ruby Enterprise Edition instead of regular Ruby. Press Enter to continue, or Ctrl-C to abort. Checking for required software... * C compiler... found at /usr/bin/gcc * C++ compiler... found at /usr/bin/g++ * The 'make' tool... found at /usr/bin/make * Zlib development headers... found * OpenSSL development headers... found * GNU Readline development headers... found -------------------------------------------- Target directory Where would you like to install Ruby Enterprise Edition to? (All Ruby Enterprise Edition files will be put inside that directory.) [/opt/ruby-enterprise] : -------------------------------------------- Compiling and optimizing the memory allocator for Ruby Enterprise Edition In the mean time, feel free to grab a cup of coffee. ./configure --prefix=/opt/ruby-enterprise --disable-dependency-tracking checking build system type... i686-pc-linux-gnu checking host system type... i686-pc-linux-gnu checking for a BSD-compatible install... /usr/bin/install -c checking whether build environment is sane... configure: error: newly created file is older than distributed files! Check your system clock This is a virtual machine running on virtualbox, and the time of the host and the virtual machine are identical, and up to date. I've also tried running this after updating time with an ntp-client, so no avail. I tried this after reading this post here of someone having a similar problem [vagrant@vagrant-centos-5 ruby-enterprise-1.8.7-2009.10]$ date Tue Apr 27 08:09:05 BST 2010 The other approach I've tried is to touch the top level the files in the build folder like suggested here, but this hasn't worked either (an to be honest, I'm not sure why it would have worked either) [vagrant@vagrant-centos-5 ruby-enterprise-1.8.7-2009.10]$ sudo touch ruby-enterprise-1.8.7-2009.10/* I'm not sure what I can do next here - the problem seems to be the bash configure script that returns this error error: newly created file is older than distributed files!, at line :2214 { echo "$as_me:$LINENO: checking whether build environment is sane" >&5 echo $ECHO_N "checking whether build environment is sane... 
$ECHO_C" >&6; } # Just in case sleep 1 echo timestamp > conftest.file # Do `set' in a subshell so we don't clobber the current shell's # arguments. Must try -L first in case configure is actually a # symlink; some systems play weird games with the mod time of symlinks # (eg FreeBSD returns the mod time of the symlink's containing # directory). if ( set X `ls -Lt $srcdir/configure conftest.file 2> /dev/null` if test "$*" = "X"; then # -L didn't work. set X `ls -t $srcdir/configure conftest.file` fi rm -f conftest.file if test "$*" != "X $srcdir/configure conftest.file" \ && test "$*" != "X conftest.file $srcdir/configure"; then # If neither matched, then we have a broken ls. This can happen # if, for instance, CONFIG_SHELL is bash and it inherits a # broken ls alias from the environment. This has actually # happened. Such a system could not be considered "sane". { { echo "$as_me:$LINENO: error: ls -t appears to fail. Make sure there is not a broken alias in your environment" >&5 echo "$as_me: error: ls -t appears to fail. Make sure there is not a broken alias in your environment" >&2;} { (exit 1); exit 1; }; } fi ### PROBLEM LINE #### # this line is the problem line - this is returned true, sometimes it isn't and I can't # see a pattern that that determines when this will test will pass or not. test "$2" = conftest.file ) then # Ok. : else { { echo "$as_me:$LINENO: error: newly created file is older than distributed files! Check your system clock" >&5 echo "$as_me: error: newly created file is older than distributed files! Check your system clock" >&2;} { (exit 1); exit 1; }; } fi the thing that makes this really frustrating is that this script works sometimes, when the VM has been running for an hour or so it works, but not at boot. There's nothing I see in the crontab that suggests any hourly tasks are run that might change the state of the system enough make a difference to this script working. I'm totally at a loss when it comes to debugging beyond here. What's the best approach to take here? Thanks

    Read the article

  • Set-Cookie Headers getting stripped in ASP.NET HttpHandlers

    - by Rick Strahl
    Yikes, I ran into a real bummer of an edge case yesterday in one of my older low level handler implementations (for West Wind Web Connection in this case). Basically this handler is a connector for a backend Web framework that creates self contained HTTP output. An ASP.NET Handler captures the full output, and then shoves the result down the ASP.NET Response object pipeline writing out the content into the Response.OutputStream and seperately sending the HttpHeaders in the Response.Headers collection. The headers turned out to be the problem and specifically Http Cookies, which for some reason ended up getting stripped out in some scenarios. My handler works like this: Basically the HTTP response from the backend app would return a full set of HTTP headers plus the content. The ASP.NET handler would read the headers one at a time and then dump them out via Response.AppendHeader(). But I found that in some situations Set-Cookie headers sent along were simply stripped inside of the Http Handler. After a bunch of back and forth with some folks from Microsoft (thanks Damien and Levi!) I managed to pin this down to a very narrow edge scenario. It's easiest to demonstrate the problem with a simple example HttpHandler implementation. The following simulates the very much simplified output generation process that fails in my handler. Specifically I have a couple of headers including a Set-Cookie header and some output that gets written into the Response object.using System.Web; namespace wwThreads { public class Handler : IHttpHandler { /* NOTE: * * Run as a web.config set handler (see entry below) * * Best way is to look at the HTTP Headers in Fiddler * or Chrome/FireBug/IE tools and look for the * WWHTREADSID cookie in the outgoing Response headers * ( If the cookie is not there you see the problem! ) */ public void ProcessRequest(HttpContext context) { HttpRequest request = context.Request; HttpResponse response = context.Response; // If ClearHeaders is used Set-Cookie header gets removed! // if commented header is sent... response.ClearHeaders(); response.ClearContent(); // Demonstrate that other headers make it response.AppendHeader("RequestId", "asdasdasd"); // This cookie gets removed when ClearHeaders above is called // When ClearHEaders is omitted above the cookie renders response.AppendHeader("Set-Cookie", "WWTHREADSID=ThisIsThEValue; path=/"); // *** This always works, even when explicit // Set-Cookie above fails and ClearHeaders is called //response.Cookies.Add(new HttpCookie("WWTHREADSID", "ThisIsTheValue")); response.Write(@"Output was created.<hr/> Check output with Fiddler or HTTP Proxy to see whether cookie was sent."); } public bool IsReusable { get { return false; } } } } In order to see the problem behavior this code has to be inside of an HttpHandler, and specifically in a handler defined in web.config with: <add name=".ck_handler" path="handler.ck" verb="*" type="wwThreads.Handler" preCondition="integratedMode" /> Note: Oddly enough this problem manifests only when configured through web.config, not in an ASHX handler, nor if you paste that same code into an ASPX page or MVC controller. What's the problem exactly? The code above simulates the more complex code in my live handler that picks up the HTTP response from the backend application and then peels out the headers and sends them one at a time via Response.AppendHeader. One of the headers in my app can be one or more Set-Cookie. I found that the Set-Cookie headers were not making it into the Response headers output. 
Here's the Chrome Http Inspector trace: Notice, no Set-Cookie header in the Response headers! Now, running the very same request after removing the call to Response.ClearHeaders() command, the cookie header shows up just fine: As you might expect it took a while to track this down. At first I thought my backend was not sending the headers but after closer checks I found that indeed the headers were set in the backend HTTP response, and they were indeed getting set via Response.AppendHeader() in the handler code. Yet, no cookie in the output. In the simulated example the problem is this line:response.AppendHeader("Set-Cookie", "WWTHREADSID=ThisIsThEValue; path=/"); which in my live code is more dynamic ( ie. AppendHeader(token[0],token[1[]) )as it parses through the headers. Bizzaro Land: Response.ClearHeaders() causes Cookie to get stripped Now, here is where it really gets bizarre: The problem occurs only if: Response.ClearHeaders() was called before headers are added It only occurs in Http Handlers declared in web.config Clearly this is an edge of an edge case but of course - knowing my relationship with Mr. Murphy - I ended up running smack into this problem. So in the code above if you remove the call to ClearHeaders(), the cookie gets set!  Add it back in and the cookie is not there. If I run the above code in an ASHX handler it works. If I paste the same code (with a Response.End()) into an ASPX page, or MVC controller it all works. Only in the HttpHandler configured through Web.config does it fail! Cue the Twilight Zone Music. Workarounds As is often the case the fix for this once you know the problem is not too difficult. The difficulty lies in tracking inconsistencies like this down. Luckily there are a few simple workarounds for the Cookie issue. Don't use AppendHeader for Cookies The easiest and obvious solution to this problem is simply not use Response.AppendHeader() to set Cookies. Duh! Under normal circumstances in application level code there's rarely a reason to write out a cookie like this:response.AppendHeader("Set-Cookie", "WWTHREADSID=ThisIsThEValue; path=/"); but rather create the cookie using the Response.Cookies collection:response.Cookies.Add(new HttpCookie("WWTHREADSID", "ThisIsTheValue")); Unfortunately, in my case where I dynamically read headers from the original output and then dynamically  write header key value pairs back  programmatically into the Response.Headers collection, I actually don't look at each header specifically so in my case the cookie is just another header. My first thought was to simply trap for the Set-Cookie header and then parse out the cookie and create a Cookie object instead. But given that cookies can have a lot of different options this is not exactly trivial, plus I don't really want to fuck around with cookie values which can be notoriously brittle. Don't use Response.ClearHeaders() The real mystery in all this is why calling Response.ClearHeaders() prevents a cookie value later written with Response.AppendHeader() to fail. I fired up Reflector and took a quick look at System.Web and HttpResponse.ClearHeaders. There's all sorts of resetting going on but nothing that seems to indicate that headers should be removed later on in the request. The code in ClearHeaders() does access the HttpWorkerRequest, which is the low level interface directly into IIS, and so I suspect it's actually IIS that's stripping the headers and not ASP.NET, but it's hard to know. Somebody from Microsoft and the IIS team would have to comment on that. 
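Coming back to the first workaround for a moment: if you do decide to trap Set-Cookie in the header loop and route it through Response.Cookies rather than AppendHeader(), a rough sketch of the parsing might look like the code below. To be clear, this is my own simplified illustration rather than anything from the actual handler - it only understands the common attributes (path, domain, expires, secure, HttpOnly) and will not cope with every exotic cookie a backend might emit.

// assumes: using System; using System.Web;
private static HttpCookie ParseSetCookie(string headerValue)
{
    // first segment is name=value, the remaining segments are attributes
    string[] parts = headerValue.Split(';');
    string[] nameValue = parts[0].Split(new[] { '=' }, 2);
    var cookie = new HttpCookie(nameValue[0].Trim(),
                                nameValue.Length > 1 ? nameValue[1] : string.Empty);

    for (int i = 1; i < parts.Length; i++)
    {
        string[] attribute = parts[i].Split(new[] { '=' }, 2);
        string key = attribute[0].Trim().ToLowerInvariant();
        string value = attribute.Length > 1 ? attribute[1].Trim() : string.Empty;

        switch (key)
        {
            case "path": cookie.Path = value; break;
            case "domain": cookie.Domain = value; break;
            case "expires":
                DateTime expires;
                if (DateTime.TryParse(value, out expires))
                    cookie.Expires = expires;
                break;
            case "secure": cookie.Secure = true; break;
            case "httponly": cookie.HttpOnly = true; break;
        }
    }
    return cookie;
}

With something like that in place the header loop could call response.Cookies.Add(ParseSetCookie(headerValue)) whenever it sees a Set-Cookie key, and leave AppendHeader() for everything else.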
In my application it's probably safe to simply skip ClearHeaders() in my handler. The ClearHeaders/ClearContent was mainly for safety but after reviewing my code there really should never be a reason that headers would be set prior to this method firing. However, if for whatever reason headers do need to be cleared, it's easy enough to manually clear the headers out:private void RemoveHeaders(HttpResponse response) { List<string> headers = new List<string>(); foreach (string header in response.Headers) { headers.Add(header); } foreach (string header in headers) { response.Headers.Remove(header); } response.Cookies.Clear(); } Now I can replace the call to Response.ClearHeaders() with RemoveHeaders() and I don't get the funky side-effects from Response.ClearHeaders(). Summary I realize this is a total edge case as this occurs only in HttpHandlers that are manually configured. It looks like you'll never run into this in any of the higher level ASP.NET frameworks or even in ASHX handlers - only web.config defined handlers - which is really, really odd. After all, those frameworks use the same underlying ASP.NET architecture. Hopefully somebody from Microsoft has an idea what crazy dependency was triggered here to make this fail. IAC, there are workarounds to this should you run into it, although I bet when you do run into it, it'll likely take a bit of time to find the problem or even this post in a search because it's not easy to correlate the problem to the solution. It's quite possible that more than cookies are affected by this behavior. Searching for a solution I read a few other accounts where headers like Referer were mysteriously disappearing, and it's possible that something similar is happening in those cases. Again, extreme edge case, but I'm writing this up here as documentation for myself and possibly some others that might have run into this. © Rick Strahl, West Wind Technologies, 2005-2012. Posted in ASP.NET, IIS7

    Read the article

  • openGL migration from SFML to glut, vertices arrays or display lists are not displayed

    - by user3714670
    Due to using quad buffered stereo 3D (which i have not included yet), i need to migrate my openGL program from a SFML window to a glut window. With SFML my vertices and display list were properly displayed, now with glut my window is blank white (or another color depending on the way i clear it). Here is the code to initialise the window : int type; int stereoMode = 0; if ( stereoMode == 0 ) type = GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH; else type = GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH | GLUT_STEREO; glutInitDisplayMode(type); int argc = 0; char *argv = ""; glewExperimental = GL_TRUE; glutInit(&argc, &argv); bool fullscreen = false; glutInitWindowSize(width,height); int win = glutCreateWindow(title.c_str()); glutSetWindow(win); assert(win != 0); if ( fullscreen ) { glutFullScreen(); width = glutGet(GLUT_SCREEN_WIDTH); height = glutGet(GLUT_SCREEN_HEIGHT); } GLenum err = glewInit(); if (GLEW_OK != err) { fprintf(stderr, "Error: %s\n", glewGetErrorString(err)); } glutDisplayFunc(loop_function); This is the only code i had to change for now, but here is the code i used with sfml and displayed my objects in the loop, if i change the value of glClearColor, the window's background does change color : glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT); glClearColor(255.0f, 255.0f, 255.0f, 0.0f); glLoadIdentity(); sf::Time elapsed_time = clock.getElapsedTime(); clock.restart(); camera->animate(elapsed_time.asMilliseconds()); camera->look(); for (auto i = objects->cbegin(); i != objects->cend(); ++i) (*i)->draw(camera); glutSwapBuffers(); Is there any other changes i should have done switching to glut ? that would be great if someone could enlighten me on the subject. In addition to that, i found out that adding too many objects (that were well handled before with SFML), openGL gives error 1285: out of memory. Maybe this is related. EDIT : Here is the code i use to draw each object, maybe it is the problem : GLuint LightID = glGetUniformLocation(this->shaderProgram, "LightPosition_worldspace"); if(LightID ==-1) cout << "LightID not found ..." << endl; GLuint MaterialAmbientID = glGetUniformLocation(this->shaderProgram, "MaterialAmbient"); if(LightID ==-1) cout << "LightID not found ..." << endl; GLuint MaterialSpecularID = glGetUniformLocation(this->shaderProgram, "MaterialSpecular"); if(LightID ==-1) cout << "LightID not found ..." << endl; glm::vec3 lightPos = glm::vec3(0,150,150); glUniform3f(LightID, lightPos.x, lightPos.y, lightPos.z); glUniform3f(MaterialAmbientID, MaterialAmbient.x, MaterialAmbient.y, MaterialAmbient.z); glUniform3f(MaterialSpecularID, MaterialSpecular.x, MaterialSpecular.y, MaterialSpecular.z); // Get a handle for our "myTextureSampler" uniform GLuint TextureID = glGetUniformLocation(shaderProgram, "myTextureSampler"); if(!TextureID) cout << "TextureID not found ..." << endl; glActiveTexture(GL_TEXTURE0); sf::Texture::bind(texture); glUniform1i(TextureID, 0); // 2nd attribute buffer : UVs GLuint vertexUVID = glGetAttribLocation(shaderProgram, "color"); if(vertexUVID==-1) cout << "vertexUVID not found ..." << endl; glEnableVertexAttribArray(vertexUVID); glBindBuffer(GL_ARRAY_BUFFER, color_array_buffer); glVertexAttribPointer(vertexUVID, 2, GL_FLOAT, GL_FALSE, 0, 0); GLuint vertexNormal_modelspaceID = glGetAttribLocation(shaderProgram, "normal"); if(!vertexNormal_modelspaceID) cout << "vertexNormal_modelspaceID not found ..." 
<< endl; glEnableVertexAttribArray(vertexNormal_modelspaceID); glBindBuffer(GL_ARRAY_BUFFER, normal_array_buffer); glVertexAttribPointer(vertexNormal_modelspaceID, 3, GL_FLOAT, GL_FALSE, 0, 0 ); GLint posAttrib; posAttrib = glGetAttribLocation(shaderProgram, "position"); if(!posAttrib) cout << "posAttrib not found ..." << endl; glEnableVertexAttribArray(posAttrib); glBindBuffer(GL_ARRAY_BUFFER, position_array_buffer); glVertexAttribPointer(posAttrib, 3, GL_FLOAT, GL_FALSE, 0, 0); glBindBuffer(GL_ELEMENT_ARRAY_BUFFER, elements_array_buffer); glDrawElements(GL_TRIANGLES, indices.size(), GL_UNSIGNED_INT, 0); GLuint error; while ((error = glGetError()) != GL_NO_ERROR) { cerr << "OpenGL error: " << error << endl; } disableShaders();
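Not an answer as such, but a reference point that may help narrow things down. Two details stand out in the snippets above: glClearColor() takes components in the 0.0-1.0 range (255.0f is clamped to 1.0, which would produce exactly the blank white window described), and nothing in the posted code shows a call to glutMainLoop(), without which GLUT never invokes the display callback at all. A minimal skeleton to compare against - the draw_scene() placeholder stands in for the camera and object drawing above:

#include <GL/glew.h>
#include <GL/glut.h>

void display()
{
    glClearColor(0.2f, 0.2f, 0.2f, 1.0f);   // components are in [0, 1], not [0, 255]
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    // draw_scene();                         // camera->look() and the object draw calls would go here
    glutSwapBuffers();
    glutPostRedisplay();                     // request the next frame; replaces SFML's explicit while loop
}

int main(int argc, char** argv)
{
    glutInit(&argc, argv);
    glutInitDisplayMode(GLUT_DOUBLE | GLUT_RGB | GLUT_DEPTH);
    glutInitWindowSize(800, 600);
    glutCreateWindow("glut port");

    glewExperimental = GL_TRUE;
    if (glewInit() != GLEW_OK) return 1;     // GLEW must be initialised after the window/context exists

    glutDisplayFunc(display);
    glutMainLoop();                          // without this the display callback never runs
    return 0;
}

If the full program already calls glutMainLoop(), then the clear colour and the source of the GL_OUT_OF_MEMORY (1285) errors are the next things I would look at.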

    Read the article

  • Check Your Spelling, Grammar, and Style in Firefox and Chrome

    - by Matthew Guay
    Are you tired of making simple writing mistakes that get past your browser’s spell-check?  Here’s how you can get advanced grammar check and more in Firefox and Chrome with After the Deadline. Microsoft Word has spoiled us with grammar, syntax, and spell checking, but the default spell check in Firefox and Chrome still only does basic checks.  Even webapps like Google Docs don’t check more than basic spelling errors.  However, WordPress.com is an exception; it offers advanced spelling, grammar, and syntax checking with its After the Deadline proofing system.  This helps you keep from making embarrassing mistakes on your blog posts, and now, thanks to a couple free browser plugins, it can help you keep from making these mistakes in any website or webapp. After the Deadline in Google Chrome Add the After the Deadline extension (link below) to Chrome as usual. As soon as it’s installed, you’re ready to start improving your online writing.  To check spelling, grammar, and more, click the ABC button that you’ll now see at the bottom of most text boxes online. After a quick scan, grammar mistakes are highlighted in green, complex expressions and other syntax problems are highlighted in blue, and spelling mistakes are highlighted in red as would be expected.  Click on an underlined word to choose one of its recommended changes or ignore the suggestion. Or, if you want more explanation about what was wrong with that word or phrase, click Explain for more info. And, if you forget to run an After the Deadline scan before submitting a text entry, it will automatically check to make sure you still want to submit it.  Click Cancel to go back and check your writing first.   To change the After the Deadline settings, click its icon in the toolbar and select View Options.  Additionally, if you want to disable it on the site you’re on, you can click Disable on this site directly from the popup. From the settings page, you can choose extra things to check for such as double negatives and redundant phrases, as well as add sites and words to ignore. After the Deadline in Firefox Add the After the Deadline add-on to Firefox (link below) as normal. After the Deadline basically the same in Firefox as it does in Chrome.  Select the ABC icon in the lower right corner of textboxes to check them for problems, and After the Deadline will underline the problems as it did in Chrome.  To view a suggested change in Firefox, right-click on the underlined word and select the recommended change or ignore the suggestion. And, if you forget to check, you’ll see a friendly reminder asking if you’re sure you want to submit your text like it is. You can access the After the Deadline settings in Firefox from the menu bar.  Click Tools, then select AtD Preferences.  In Firefox, the settings are in a options dialog with three tabs, but it includes the same options as the Chrome settings page.  Here you can make After the Deadline as correction-happy as you like.   Conclusion The web has increasingly become an interactive place, and seldom does a day go by that we aren’t entering text in forms and comments that may stay online forever.  Even our insignificant tweets are being archived in the Library of Congress.  After the Deadline can help you make sure that your permanent internet record is as grammatically correct as possible.  Even though it doesn’t catch every problem, and even misses some spelling mistakes, it’s still a great help. 
Links Download the After the Deadline extension for Google Chrome Download the After the Deadline add-on for Firefox

    Read the article

  • CodePlex Daily Summary for Sunday, July 14, 2013

    CodePlex Daily Summary for Sunday, July 14, 2013Popular ReleasesVidCoder: 1.4.23: New in 1.4.23 Added French translation. Fixed non-x264 video encoders not sticking in video tab. New in 1.4 Updated HandBrake core to 0.9.9 Blu-ray subtitle (PGS) support Additional framerates: 30, 50, 59.94, 60 Additional sample rates: 8, 11.025, 12 and 16 kHz Additional higher bitrates for audio Same as Source Constant Framerate 24-bit FLAC encoding Added Windows Phone 8 and Apple TV 3 presets Introduced process isolation for encodes. Now if HandBrake crashes, VidCoder will ...Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.96: Fix for issue #19957: EXE should output the name of the file(s) being minified. Discussion #449181: throw a Sev-2 warning when trailing commas are detected on an Array literal. Perfectly legal to do so, but the behavior ends up working differently on different browsers, so throw a cross-browser warning. Add a few more known global names for improved ES6 compatibility update Nuget package to version 2.5 and automatically add the AjaxMin.targets to your project when you update the package...JSLint.NET: JSLint.NET 0.9.1 Beta: Version 0.9.1 Beta includes: JSLint.NET Console: Console help available using the /? switch (or no arguments). See the Console Options page for more. JSLint.NET for Visual Studio: Support linked JSLintNet.json file. More about this file on the JSLint.NET Settings page. Hide un-implemented menu items. Prefix JSLint errors in the task list. JSLint.NET Core: Allow ignoring of individual files in JSLintNet.json.TypePipe: 1.15.2.0 (.NET 4.5): This is build 1.15.2.0 of the TypePipe for .NET 4.5. Find the complete release notes for the build here: Release Notes.re-linq: 1.15.2.0 (.NET 4.5): This is build 1.15.2.0 of re-linq for .NET 4.5. Find the complete release notes for the build here: Release Notes To use re-linq with .NET 3.5, use a 1.13.x build.Columbus Remote Desktop: 2.0 Sapphire: Added configuration settings Added update notifications Added ability to disable GPU acceleration Fixed connection bugsSearch for Team Foundation Server workitems changes: Release 1.2: - Issue 1184 fixed, - Changeset's comboboxes sorted by Id (From : Ascending - To : Descending) - Application window iconImpulse Media Player: Impulse Media Player 3.5.0.1: Fixed a crash that occurs when copying data from lastfm to file panelPhoneGuitarTab: Release 1.1: Improved UX. Simplified navigation. More performance improvements coming soon.The GLMET Project: Get OS Version: --DataDevelop: Beta 0.6.5: Hotfix bug in Python Table.ImportAll method Updated External Libraries Fixes in Excel Exportation Modify ConnectionString refreshes the Properties Window correctlyUser Group Labs: User Group Data: 01.00.00: This release has the following updates and new features: Initial release with a minimal feature set Easy to use (just add to the social group details page) Edit common user group properties System Requirements DNN v07.00.02 or newer .Net Framework v4.0 or newerCarrotCake, an ASP.Net WebForms CMS: Binaries and PDFs - Zip Archive (v. 4.3 20130709): Product documentation and additional templates for this version is included in the zip archive, or if you want individual files, visit the http://www.carrotware.com website. Templates, in addition to those found in the download, can be downloaded individually from the website as well. If you are coming from earlier versions, make a precautionary backup of your existing website files and database. 
When installing the update, the database update engine will create the new schema items (if you...Dalmatian Build Script: Dalmatian Build 0.1.3.0: -Minor bug fixes -Added Choose<T> and ChooseYesNo to Console objectPushover.NET: Pushover.NET - Stable Release 10 July 2013: This is the first stable release of Pushover.NET. It includes 14 overloads of the SendNotification method, giving you total flexibility of sending Pushover notifications from your client. Assembly is built to .NET 2.x so it can be called from .NET 2.x, 3.x and 4.x projects. Also available is the Test Harness. This is a small GUI that you can use to test calls to Pushover.NET's main DLL. It's almost fully functional--the sound effects haven't been fully configured so no matter what you pick ...MCEBuddy 2.x: MCEBuddy 2.3.14: 2.3.14 BETA is available through the Early Access Program.Click here https://mcebuddy2x.codeplex.com/discussions/439439 for details and to get access to Early Access Program to download latest releases. Changelog for 2.3.14 (32bit and 64bit) NEW FEATURES: 1. ENHANCEMENTS: 2. Improved eMail notifications 3. Improved metrics details 4. Support for larger history (INI) file (about 45,000 sections, each section can have about 1500 entries) BUG FIXES: 5. Fix for extracting Movie release year from...Azure Depot: Flask: Flask Version 01LINQ to Twitter: LINQ to Twitter v2.1.07: Supports .NET 3.5, .NET 4.0, .NET 4.5, Silverlight 4.0, Windows Phone 7.1, Windows Phone 8, Client Profile, Windows 8, and Windows Azure. 100% Twitter API coverage. Also supports Twitter API v1.1! Also on NuGet.DotNetNuke® Community Edition CMS: 06.02.08: Major Highlights Fixed issue where the application throws an Unhandled Error and an HTTP Response Code of 200 when the connection to the database is lost. Security FixesNone Updated Modules/Providers ModulesNone ProvidersNoneModern UI for WPF: Modern UI 1.0.5: The ModernUI assembly including a demo app demonstrating the various features of Modern UI for WPF. BREAKING CHANGE This version is backwards incompatible. ModernDialog.ShowMessage returns MessageBoxResult instead of bool? Related downloads NuGet ModernUI for WPF is also available as NuGet package in the NuGet gallery, id: ModernUI.WPF Download Modern UI for WPF Templates A Visual Studio 2012 extension containing a collection of project and item templates for Modern UI for WPF. The extensi...New ProjectsA Domain-Driven Design Framework for .Net: A .Net framework library for applying the domain-driven design approach to develop business software.a Linq to Workitem provider: Wilinq is a linq to workitem provider. It also contains WIQL to expression tree parser. 
Wilinq is based on the the fissum project source codeApprentice for WP: Apprentice for WPArgo New Deal: Data Type DBL DAL UI ToolsC# Practice: C# PracticeDardemEvo: summaryDavid.A.Zhang: Personal class LibrayEnglish Practice Helper: English Practice Helper is a C# window form application for everyone want to practice writing,speaking,listening and reading skill with your OWN computerFinancialManagement: FinancialManagementGoAgent GUI: GoAgent??????。GoAgent: https://code.google.com/p/goagentIndustrial Programming: Industrial Programming approaches tips (it's old and in russian language)ISS.IR.RRN-MS: Summary Tany :PLifeDataManager: Web project to manage some dataMixERP - ERP Solution That Sucks Less: A humble ERP solution that does not scare the users, MixERP is a purely mult-establishment and multi-currency solution.Nokia Portal: Install Nokia, HTC and LG apps on any WP8 devicePenn State SWENG 581 Team 5 Su13.2: This project is an academic extension of the NClass project found at http://nclass.sourceforge.net for the purposes of software testing and quality assurance.Pomp: testProfessor Oak's Pokemon Library DotNet: The Professor Oak's Pokemon Library is a .NET class library that aims to help programmers, by providing different tools to modify the game memory.Pure Music Player: Pure Music PlayerRandomly Balanced Trees: C# Implementations of Treap and Skiplist data structures. Which are representations of randomly balanced binary trees.ReoScript: JavaScript-like script language engine for .NET application. Easy to plug in .NET program and make API extension for script. SQL Queries: This is for all developers help.SqlSetup: This project create SQL server database automatically. Truco Pythons: Truco Argentino (Argentine truc), is a card game developed in python by Argentine programmers of the UNGS (General Sarmiento National University). WebServer .NET: Projekt zawiera oprogramowanie i zestaw narzedzi do zarzadzania serwerem http. Posiada wiele funkcjonalnosci ulatwiajacych korzystanie i konfigurowanie serwera.workspaces: solr exampleWP8NativeAccess: Win32 API wrappers for Windows Phone 8. Intended to be used in WP8 WinRT apps. Includes FileSystem project.

    Read the article

  • Part 2–Load Testing In The Cloud

    - by Tarun Arora
    Welcome to Part 2, In Part 1 we discussed the advantages of creating a Test Rig in the cloud, the Azure edge and the Test Rig Topology we want to get to. In Part 2, Let’s start by understanding the components of Azure we’ll be making use of followed by manually putting them together to create the test rig, so… let’s get down dirty start setting up the Test Rig.  What Components of Azure will I be using for building the Test Rig in the Cloud? To run the Test Agents we’ll make use of Windows Azure Compute and to enable communication between Test Controller and Test Agents we’ll make use of Windows Azure Connect.  Azure Connect The Test Controller is on premise and the Test Agents are in the cloud (How will they talk?). To enable communication between the two, we’ll make use of Windows Azure Connect. With Windows Azure Connect, you can use a simple user interface to configure IPsec protected connections between computers or virtual machines (VMs) in your organization’s network, and roles running in Windows Azure. With this you can now join Windows Azure role instances to your domain, so that you can use your existing methods for domain authentication, name resolution, or other domain-wide maintenance actions. For more details refer to an overview of Windows Azure connect. A very useful video explaining everything you wanted to know about Windows Azure connect.  Azure Compute Windows Azure compute provides developers a platform to host and manage applications in Microsoft’s data centres across the globe. A Windows Azure application is built from one or more components called ‘roles.’ Roles come in three different types: Web role, Worker role, and Virtual Machine (VM) role, we’ll be using the Worker role to set up the Test Agents. A very nice blog post discussing the difference between the 3 role types. Developers are free to use the .NET framework or other software that runs on Windows with the Worker role or Web role. Developers can also create applications using languages such as PHP and Java. More on Windows Azure Compute. Each Windows Azure compute instance represents a virtual server... Virtual Machine Size CPU Cores Memory Cost Per Hour Extra Small Shared 768 MB $0.04 Small 1 1.75 GB $0.12 Medium 2 3.50 GB $0.24 Large 4 7.00 GB $0.48 Extra Large 8 14.00 GB $0.96   You might want to review the Windows Azure Pricing FAQ. Let’s Get Started building the Test Rig… Configuration Machine Role Comments VM – 1 Domain Controller for Playpit.com On Premise VM – 2 TFS, Test Controller On Premise VM – 3 Test Agent Cloud   In this blog post I would assume that you have the domain, Team Foundation Server and Test Controller Installed and set up already. If not, please refer to the TFS 2010 Installation Guide and this walkthrough on MSDN to set up your Test Controller. You can also download a preconfigured TFS 2010 VM from Brian Keller's blog, Brian also has some great hands on Labs on TFS 2010 that you may want to explore. I. Lets start building VM – 3: The Test Agent Download the Windows Azure SDK and Tools Open Visual Studio and create a new Windows Azure Project using the Cloud Template                   Choose the Worker Role for reasons explained in the earlier post         The WorkerRole.cs implements the Run() and OnStart() methods, no code changes required. You should be able to compile the project and run it in the compute emulator (The compute emulator should have been installed as part of the Windows Azure Toolkit) on your local machine.                   
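For reference, the worker role class that the Cloud template generates looks roughly like the code below (reproduced from memory of the 1.x SDK template, so treat the exact strings as approximate); the point, as noted above, is that none of it needs to change for the Test Agent scenario.

using System.Diagnostics;
using System.Net;
using System.Threading;
using Microsoft.WindowsAzure.ServiceRuntime;

namespace WorkerRole1
{
    public class WorkerRole : RoleEntryPoint
    {
        public override void Run()
        {
            // Sample worker implementation generated by the template
            Trace.WriteLine("WorkerRole1 entry point called", "Information");

            while (true)
            {
                Thread.Sleep(10000);
                Trace.WriteLine("Working", "Information");
            }
        }

        public override bool OnStart()
        {
            // Set the maximum number of concurrent connections
            ServicePointManager.DefaultConnectionLimit = 12;
            return base.OnStart();
        }
    }
}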
We will only be making changes to WindowsAzureProject, open ServiceDefinition.csdef. Ensure that the vmsize is small (remember the cost chart above). Import the “Connect” module. I am importing the Connect module because I need to join the Worker role VM to the Playpit domain. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect"/> </Imports> </WorkerRole> </ServiceDefinition> Go to the ServiceConfiguration.Cloud.cscfg and note that settings with key ‘Microsoft.WindowsAzure.Plugins.Connect.%%%%’ have been added to the configuration file. This is because you decided to import the connect module. See the config below. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration>             Let’s go step by step and understand all the highlighted parameters and where you can find the values for them.       osFamily – By default this is set to 1 (Windows Server 2008 SP2). Change this to 2 if you want the Windows Server 2008 R2 operating system. The Advantage of using osFamily = “2” is that you get Powershell 2.0 rather than Powershell 1.0. In Powershell 2.0 you could simply use “powershell -ExecutionPolicy Unrestricted ./myscript.ps1” and it will work while in Powershell 1.0 you will have to change the registry key by including the following in your command file “reg add HKLM\Software\Microsoft\PowerShell\1\ShellIds\Microsoft.PowerShell /v ExecutionPolicy /d Unrestricted /f” before you can execute any power shell. The other reason you might want to move to os2 is if you wanted IIS 7.5.       Activation Token – To enable communication between the on premise machine and the Windows Azure Worker role VM both need to have the same token. Log on to Windows Azure Management Portal, click on Connect, click on Get Activation Token, this should give you the activation token, copy the activation token to the clipboard and paste it in the configuration file. 
Note – Later in the blog I’ll be showing you how to install connect on the on premise machine.                       EnableDomainJoin – Set the value to true, ofcourse we want to join the on windows azure worker role VM to the domain.       DomainFQDN, DomainControllerFQDN, DomainAccountName, DomainPassword, DomainOU, Administrators – This information is specific to your domain. I have extracted this information from the ‘service manager’ and ‘Active Directory Users and Computers’. Also, i created a new Domain-OU namely ‘CloudInstances’ so all my cloud instances joined to my domain show up here, this is optional. You can encrypt the DomainPassword – refer to the instructions here. Or hold fire, I’ll be covering that when i come to certificates and encryption in the coming section.       Now once you have filled all this information up, the configuration file should look something like below, <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> </ConfigurationSettings> </Role> </ServiceConfiguration> Next we will be enabling the Remote Desktop module in to the ServiceDefinition.csdef, we could make changes manually or allow a beautiful wizard to help us make changes. I prefer the second option. So right click on the Windows Azure project and choose Publish       Now once you get the publish wizard, if you haven’t already you would be asked to import your Windows Azure subscription, this is simply the Msdn subscription activation key xml. Once you have done click Next to go to the Settings page and check ‘Enable Remote Desktop for all roles’.       As soon as you do that you get another pop up asking you the details for the user that you would be logging in with (make sure you enter a reasonable expiry date, you do not want the user account to expire today). Notice the more information tag at the bottom, click that to get access to the certificate section. See screen shot below.       
From the drop down select the option to create a new certificate        In the pop up window enter the friendly name for your certificate. In my case I entered ‘WAC – Test Rig’ and click ok. This will create a new certificate for you. Click on the view button to see the certificate details. Do you see the Thumbprint, this is the value that will go in the config file (very important). Now click on the Copy to File button to copy the certificate, we will need to import the certificate to the windows Azure Management portal later. So, make sure you save it a safe location.                                Click Finish and enter details of the user you would like to create with permissions for remote desktop access, once you have entered the details on the ‘Remote desktop configuration’ screen click on Ok. From the Publish Windows Azure Wizard screen press Cancel. Cancel because we don’t want to publish the role just yet and Yes because we want to save all the changes in the config file.       Now if you go to the ServiceDefinition.csdef file you will see that the RemoteAccess and RemoteForwarder roles have been imported for you. <?xml version="1.0" encoding="utf-8"?> <ServiceDefinition name="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceDefinition"> <WorkerRole name="WorkerRole1" vmsize="Small"> <Imports> <Import moduleName="Diagnostics" /> <Import moduleName="Connect" /> <Import moduleName="RemoteAccess" /> <Import moduleName="RemoteForwarder" /> </Imports> </WorkerRole> </ServiceDefinition> Now go to the ServiceConfiguration.Cloud.cscfg file and you see a whole bunch for setting “Microsoft.WindowsAzure.Plugins.RemoteAccess.%%%” values added for you. <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="WindowsAzureProject2" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="2" osVersion="*"> <Role name="WorkerRole1"> <Instances count="1" /> <ConfigurationSettings> <Setting name="Microsoft.WindowsAzure.Plugins.Diagnostics.ConnectionString" value="UseDevelopmentStorage=true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.ActivationToken" value="45f55fea-f194-4fbc-b36e-25604faac784" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Refresh" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.WaitForConnectivity" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Upgrade" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.EnableDomainJoin" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainFQDN" value="play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainControllerFQDN" value="WIN-KUDQMQFGQOL.play.pit.com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainAccountName" value="playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainPassword" value="************************" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainOU" value="OU=CloudInstances, DC=Play, DC=Pit, DC=com" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.Administrators" value="Playpit\Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.Connect.DomainSiteName" value="" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.Enabled" value="true" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountUsername" value="Administrator" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountEncryptedPassword" 
value="MIIBnQYJKoZIhvcNAQcDoIIBjjCCAYoCAQAxggFOMIIBSgIBADAyMB4xHDAaBgNVBAMME1dpbmRvd 3MgQXp1cmUgVG9vbHMCEGa+B46voeO5T305N7TSG9QwDQYJKoZIhvcNAQEBBQAEggEABg4ol5Xol66Ip6QKLbAPWdmD4ae ADZ7aKj6fg4D+ATr0DXBllZHG5Umwf+84Sj2nsPeCyrg3ZDQuxrfhSbdnJwuChKV6ukXdGjX0hlowJu/4dfH4jTJC7sBWS AKaEFU7CxvqYEAL1Hf9VPL5fW6HZVmq1z+qmm4ecGKSTOJ20Fptb463wcXgR8CWGa+1w9xqJ7UmmfGeGeCHQ4QGW0IDSBU6ccg vzF2ug8/FY60K1vrWaCYOhKkxD3YBs8U9X/kOB0yQm2Git0d5tFlIPCBT2AC57bgsAYncXfHvPesI0qs7VZyghk8LVa9g5IqaM Cp6cQ7rmY/dLsKBMkDcdBHuCTAzBgkqhkiG9w0BBwEwFAYIKoZIhvcNAwcECDRVifSXbA43gBApNrp40L1VTVZ1iGag+3O1" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteAccess.AccountExpiration" value="2012-11-27T23:59:59.0000000+00:00" /> <Setting name="Microsoft.WindowsAzure.Plugins.RemoteForwarder.Enabled" value="true" /> </ConfigurationSettings> <Certificates> <Certificate name="Microsoft.WindowsAzure.Plugins.RemoteAccess.PasswordEncryption" thumbprint="AA23016CF0BDFC344400B5B82706B608B92E4217" thumbprintAlgorithm="sha1" /> </Certificates> </Role> </ServiceConfiguration>          Okay let’s look at them one at a time,       Enabled - Yes, we would like to enable Remote Access.       AccountUserName – This is the user name you entered while you were on the publish windows azure role screen, as detailed above.       AccountEncrytedPassword – Try and decode that, the certificate is used to encrypt the password you specified for the user account. Remember earlier i said, either use the instructions or wait and i’ll be showing you encryption, now the user account i am using for rdp has the same password as my domain password, so i can simply copy the value of the AccountEncryptedPassword to the DomainPassword as well.       AccountExpiration – This is the expiration as you specified in the wizard earlier, make sure your account does not expire today.       Remote Forwarder – Check out the documentation, below is how I understand it, -- One role in an application that implements a remote desktop connection must import the RemoteForwarder module. The two modules work together to enable the remote desktop connections to role instances. -- If you have multiple roles defined in the service model, it does not matter which role you add the RemoteForwarder module to, but you must add it to only one of the role definitions.       Certificate – Remember the certificate thumbprint from the wizard, the on premise machine and windows azure role machine that need to speak to each other must have the same thumbprint. More on that when we install Windows Azure connect Endpoints on the on premise machine. As i said earlier, in this blog post, I’ll be showing you the manual process so i won’t be scripting any star up tasks to install the test agent or register the test agent with the TFS Server. I’ll be showing you all this cool stuff in the next blog post, that’s because it’s important to understand the manual side of it, it becomes easier for you to troubleshoot in case something fails. Having said that, the changes we have made are sufficient to spin up the Windows Azure Worker Role aka Test Agent VM, have it connected with the play.pit.com domain and have remote access enabled on it. Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server. II. Windows Azure Connect: Setting up Connect on VM – 2 i.e. TFS & Test Controller Glad you made it so far, now to enable communication between the on premise TFS/Test Controller and Azure-ed Test Agent we need to enable communication. 
Before we deploy the Test Agent VM we need to set up Windows Azure Connect on the TFS Server.

II. Windows Azure Connect: Setting up Connect on VM – 2, i.e. TFS & Test Controller

Glad you made it this far. To enable communication between the on-premise TFS/Test Controller and the Azure-ed Test Agent, we have set up the Azure Connect module in the Test Agent configuration; now the Connect endpoints need to be enabled on the on-premise machines. Let's have a look at how to do this.

- Log on to VM – 2, the machine running the TFS Server and Test Controller.
- Log on to the Windows Azure Management Portal and click on Virtual Network. If you already have a subscription you should see the screen shot below; if not, you will be asked to complete the subscription first.
- Click on Install Local Endpoints at the top left of the panel and you get a URL with a token id appended to it. Remember the token I showed you earlier - in theory the token you get here should match the token you added to the Test Agent config file.
- Copy the URL to the clipboard and paste it into Internet Explorer (important: at present the installation only works out of IE, and you need to have cookies enabled in order to complete it). As stated in the pop-up, you can NOT download the software and run it later; you need to run it as is, since it contains a token. Once the installation completes you should see the Windows Azure Connect icon in the system tray.
- Right click the Azure Connect icon, choose Diagnostics and refer to this link for the diagnostic terminology.
- NOTE – Unfortunately I could not see the Windows Azure Connect icon in the system tray. A bit of binging with Google revealed that the icon is only shown when the 'Windows Azure Connect Endpoint' service is started. So go to services.msc and make sure that the service is started; if not, start it. Unfortunately again, the service did not start for me on a manual start, and I realised that one of its dependent services was disabled. Look at the service dependencies, start them, and then start Windows Azure Connect. Bottom line: you need to start the Windows Azure Connect Endpoint service before you can proceed. Please refer to MSDN for more on troubleshooting Windows Azure Connect. (Follow the next step as well.)
- Now go back to the Windows Azure Management Portal and from Groups and Roles create a new group; let's call it 'Test Rig'. Make sure you add VM – 2 (the TFS Server VM where you just installed the endpoint).
- Now if you go back to the Azure Connect icon in the system tray and click 'Refresh Policy', you will notice that the icon's disconnected status changes to ready for connection.

III. Importing the Certificate into the Windows Azure Management Portal

But before that, you need to import the certificate you created in Step I into the Windows Azure Management Portal.

- Log on to the Windows Azure Management Portal, click on 'Hosted Services, Storage Accounts & CDN', then 'Management Certificates', followed by Add Certificates as shown in the screen shot below.
- Browse to the location where you saved the certificate earlier (refer to Step I in case you forgot).
- You should now be able to see the imported certificate; make sure the thumbprint of the certificate matches the one you inserted in the config files.

IV. Publish the Windows Azure Worker Role aka Test Agent

Having completed I, II and III, you are ready to publish the Test Agent VM – 3 to the cloud. Go to Visual Studio, right click the Windows Azure project and select Publish. Verify the information in the wizard; from the advanced settings tab you can also enable capture of IntelliTrace or profiling information. Click Next and click Publish!
From the View menu select the Windows Azure Activity Log window. You should now be able to see the deployment progress in real time. In the Windows Azure Management Portal you should also be able to see the progress of the creation of the new Worker Role.

Once the deployment is complete you should be able to RDP (go to the run prompt, type mstsc and enter the machine name in the pop-up) into the Test Agent Worker Role VM from the Playpit network using the domain admin user account. If you are unable to log in to the Test Agent using the domain admin account, it means the process of joining the Test Agent to the domain has failed. The good news is that, because you imported the Connect module, you can still connect to the Test Agent machine using the Windows Azure Management Portal and troubleshoot the reason for the failure: log in with the user name and password you specified in the config file for the keys RemoteAccess.AccountUsername and RemoteAccess.AccountEncryptedPassword (just enter the password unencrypted), then fix it or manually join the machine to the domain. Once you have managed to join the Test Agent VM to the domain, move to the next step.

So, log in to the Test Agent Worker Role VM with the Playpit domain administrator and verify that you can log in, that the machine is connected to the domain and that the Connect service is running successfully. If yes, give yourself a pat on the back - you are 80% mission accomplished!

Go to the Windows Azure Management Portal, click on Virtual Network, click on Groups and Roles, and click Edit Group to edit the Test Rig group you created earlier. In the 'Connect to' section, click Add to select the worker role you have just deployed. Also check 'Allow connections between endpoints in the group'; with this you enable communication between the test controller and the test agents, and between the test agents themselves. Click Save.

Now you are ready to deploy the Test Agent software on the Worker Role Test Agent VM and configure it to work with the Test Controller.

V. Configuring VM – 3: Installing the Test Agent and Associating the Test Agent with the Controller

- Log in to the Worker Role Test Agent VM that you have just successfully deployed; make sure you log in with the domain administrator account.
- Download the All Agents software from MSDN ('en_visual_studio_agents_2010_x86_x64_dvd_509679.iso'), extract the iso and navigate to where you have extracted it. In my case I extracted it to "C:\Resources\Temp\VsAgentSetup". Open the Test Agent folder and double click setup.exe. If you face any issues installing the TFS Test Agent on the VM, refer to the walkthrough on MSDN.
- Once you have installed the Test Agent you should reach the configuration window. Right click the test agent configuration tool and run it as a different user, i.e. an Administrator. This is really to run the configuration wizard with elevated privileges (you might have UAC block some things otherwise).
- In the run options you can select 'service'; you do not need to run the agent as interactive unless you are running coded UI tests.
- I have specified the domain administrator to connect to the TFS Test Controller. In real life I would never do that; I would create a separate test user service account for this purpose.
But for the blog post we are using the most powerful user, so that no policies or restrictions block you.

Click the Apply Settings button and you should be all green! If not, the summary usually gives helpful error messages that you can resolve and then proceed. In my experience you may run into either a permission issue or a firewall blocking communication.

And now the moment of truth! Go to VM – 2, open up Visual Studio and from the Test menu select Manage Test Controller. Mission accomplished! You should be able to see the Test Agent that you have just configured here.

VI. Creating and Running Load Tests on your brand new Azure-ed Test Rig

I have various blog posts on performance testing with Visual Studio Ultimate; you can follow the links and videos below.

Blog posts:
- Part 1 – Performance Testing using Visual Studio 2010 Ultimate
- Part 2 – Performance Testing using Visual Studio 2010 Ultimate
- Part 3 – Performance Testing using Visual Studio 2010 Ultimate

Videos:
- Test Tools Configuration & Settings in Visual Studio
- Why & How to Record Web Performance Tests in Visual Studio Ultimate
- Goal Driven Load Testing using Visual Studio Ultimate

Now that you have created your load tests, there is one last change you need to make before you can run them on your Azure test rig: create a new test settings file, change the test execution method to 'Remote Execution' and select the test controller you configured the Worker Role Test Agent against - in our case VM – 2. So go on, fire off a test run and see the results of the test being executed on the Azure-ed test rig.

Review and What's Next?

A quick recap of the benefits of running the test rig in the cloud and what I will be covering in the next blog post - and I would love to hear your feedback!

Advantages
- Utilize the power of Azure compute to run a heavy virtual user load.
- Benefit from Azure's flexibility: destroy Test Agents when not in use; it takes less than 25 minutes to spin up a new Test Agent.
- Most importantly, test network latency. (Network latency and speed of connection are two different things; network latency is usually very hard to test.) By placing the Test Agents in Microsoft data centres around the globe, you can actually measure the lag in transferring the bytes caused not by a slow connection but by the page being requested from the other side of the globe.

Next Steps
The process of spinning up the Test Agents in Windows Azure is not 100% automated yet. I am working on the worker process and PowerShell scripts to automate the role deployment, the unattended install of the test agent software and the registration of the test agent with the test controller. In the next blog post I will show you how to make the complete process unattended and automated.

Remember to subscribe to http://feeds.feedburner.com/TarunArora. Hope you enjoyed this post; I would love to hear your feedback! If you have any recommendations on things that I should consider, or any questions or feedback, feel free to leave a comment. See you in Part III.

    Read the article

  • Unable to cast transparent proxy to type <type>

    - by Rick Strahl
    This is not the first time I've run into this wonderful error while creating new AppDomains in .NET and then trying to load types and access them across app domains. In almost all cases where I've run into this error, the problem comes from the two AppDomains involved loading different copies of the same type. Unless the types match exactly and come from exactly the same assembly, the typecast will fail. The most common scenario is that the types are loaded from different assemblies - as unlikely as that sounds.

An Example of Failure

To give some context, I'm working on some old code in Html Help Builder that creates a new AppDomain in order to parse assembly information for documentation purposes. I create a new AppDomain in order to load up an assembly, process it, and then immediately unload it along with the AppDomain. The AppDomain allows for unloading, which otherwise wouldn't be possible, as well as isolating my code from the assembly that's being loaded. The process to accomplish this is fairly established and I use it for lots of applications that use add-in like functionality - basically anywhere code needs to be isolated and have the ability to be unloaded. My pattern for this is:

- Create a new AppDomain
- Load a factory class into the AppDomain
- Use the factory class to load additional types from the remote domain

Here's the relevant code from my TypeParserFactory that creates a domain and then loads a specific type - TypeParser - that is accessed cross-AppDomain in the parent domain:

public class TypeParserFactory : System.MarshalByRefObject, IDisposable
{
    …

    /// <summary>
    /// TypeParser Factory method that loads the TypeParser
    /// object into a new AppDomain so it can be unloaded.
    /// Creates AppDomain and creates type.
    /// </summary>
    /// <returns></returns>
    public TypeParser CreateTypeParser()
    {
        if (!CreateAppDomain(null))
            return null;

        // Create the instance inside of the new AppDomain
        // Note: remote domain uses local EXE's AppBasePath!!!
        TypeParser parser = null;
        try
        {
            Assembly assembly = Assembly.GetExecutingAssembly();
            string assemblyPath = Assembly.GetExecutingAssembly().Location;
            parser = (TypeParser)this.LocalAppDomain.CreateInstanceFrom(assemblyPath,
                                      typeof(TypeParser).FullName).Unwrap();
        }
        catch (Exception ex)
        {
            this.ErrorMessage = ex.GetBaseException().Message;
            return null;
        }
        return parser;
    }

    private bool CreateAppDomain(string lcAppDomain)
    {
        if (lcAppDomain == null)
            lcAppDomain = "wwReflection" + Guid.NewGuid().ToString().GetHashCode().ToString("x");

        AppDomainSetup setup = new AppDomainSetup();

        // *** Point at current directory
        setup.ApplicationBase = AppDomain.CurrentDomain.BaseDirectory;
        //setup.PrivateBinPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "bin");

        this.LocalAppDomain = AppDomain.CreateDomain(lcAppDomain, null, setup);

        // Need a custom resolver so we can load assembly from non current path
        AppDomain.CurrentDomain.AssemblyResolve += new ResolveEventHandler(CurrentDomain_AssemblyResolve);

        return true;
    }

    …
}

Note that the classes must be either [Serializable] (marshalled by value) or inherit from MarshalByRefObject in order to be accessible remotely. Here I need to call methods on the remote object, so all classes are MarshalByRefObject.
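The CurrentDomain_AssemblyResolve handler hooked up in CreateAppDomain() isn't shown above. For completeness, here is a hedged sketch of what such a resolver typically looks like - this is my own illustration of the pattern, not the actual implementation from the post, and the probing folder is an assumption:

using System;
using System.IO;
using System.Reflection;

// Hypothetical stand-in for the resolver referenced in CreateAppDomain() above.
static class AssemblyResolver
{
    public static Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
    {
        // Probe the folder of the currently executing assembly for the requested name.
        string shortName = new AssemblyName(args.Name).Name;
        string candidate = Path.Combine(
            Path.GetDirectoryName(Assembly.GetExecutingAssembly().Location),
            shortName + ".dll");

        // Return null when the file isn't found so the normal bind failure still surfaces.
        return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
    }
}

Returning null rather than guessing keeps a failed bind visible instead of silently loading the wrong copy - which, as the rest of this post shows, is exactly the kind of thing you want to notice.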
The specific problem code loads up a new type from an assembly path that is visible in both the current domain and the remote domain, and then instantiates a type from it. This is the code in question:

Assembly assembly = Assembly.GetExecutingAssembly();
string assemblyPath = Assembly.GetExecutingAssembly().Location;
parser = (TypeParser)this.LocalAppDomain.CreateInstanceFrom(assemblyPath,
                          typeof(TypeParser).FullName).Unwrap();

The last line of code is what blows up with the Unable to cast transparent proxy to type <type> error. Without the cast the code actually returns a TransparentProxy instance; the cast is what blows up. In other words, I AM in fact getting a TypeParser instance back, but it can't be cast to the TypeParser type that is loaded in the current AppDomain.

Finding the Problem

To see what's going on I tried using the .NET 4.0 dynamic type on the result and, lo and behold, it worked - the value returned is actually a TypeParser instance:

Assembly assembly = Assembly.GetExecutingAssembly();
string assemblyPath = Assembly.GetExecutingAssembly().Location;
object objparser = this.LocalAppDomain.CreateInstanceFrom(assemblyPath,
                          typeof(TypeParser).FullName).Unwrap();

// dynamic works
dynamic dynParser = objparser;
string info = dynParser.GetVersionInfo();   // method call works

// casting fails
parser = (TypeParser)objparser;

So clearly a TypeParser type is coming back, but nevertheless it's not the right one. Hmmm… mysterious. Another couple of tries reveal the problem, however:

// works
dynamic dynParser = objparser;
string info = dynParser.GetVersionInfo();   // method call works

// c:\wwapps\wwhelp\wwReflection20.dll (current execution folder)
string info3 = typeof(TypeParser).Assembly.CodeBase;

// c:\program files\vfp9\wwReflection20.dll (my COM client EXE's folder)
string info4 = dynParser.GetType().Assembly.CodeBase;

// fails
parser = (TypeParser)objparser;

As you can see, the second value is coming from a totally different assembly - even though I EXPLICITLY SPECIFIED an assembly path to load the assembly from! Instead .NET decided to load the assembly from the original ApplicationBase folder. Ouch!

How I actually tracked this down was a little more tedious: I added a method like this to both the factory and the instance types and then compared notes:

public string GetVersionInfo()
{
    return ".NET Version: " + Environment.Version.ToString() + "\r\n" +
           "wwReflection Assembly: " + typeof(TypeParserFactory).Assembly.CodeBase.Replace("file:///", "").Replace("/", "\\") + "\r\n" +
           "Assembly Cur Dir: " + Directory.GetCurrentDirectory() + "\r\n" +
           "ApplicationBase: " + AppDomain.CurrentDomain.SetupInformation.ApplicationBase + "\r\n" +
           "App Domain: " + AppDomain.CurrentDomain.FriendlyName + "\r\n";
}

For the factory I got:

.NET Version: 4.0.30319.239
wwReflection Assembly: c:\wwapps\wwhelp\bin\wwreflection20.dll
Assembly Cur Dir: c:\wwapps\wwhelp
ApplicationBase: C:\Programs\vfp9\
App Domain: wwReflection534cfa1f

For the instance type I got:

.NET Version: 4.0.30319.239
wwReflection Assembly: C:\Programs\vfp9\wwreflection20.dll
Assembly Cur Dir: c:\wwapps\wwhelp
ApplicationBase: C:\Programs\vfp9\
App Domain: wwDotNetBridge_56006605

which clearly shows the problem. The two types live in different AppDomains, and each AppDomain loaded the assembly from a different location. Probably a better solution yet (for ANY kind of assembly loading problem) is to use the .NET Fusion Log Viewer to trace assembly loads. The Fusion viewer will show a load trace for each assembly loaded and where the loader is looking to find it.
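If the Fusion Log Viewer shows no entries, binding logging may simply be switched off. As a side note of my own (not part of the original post), logging is controlled by a handful of registry values under HKLM\SOFTWARE\Microsoft\Fusion; the snippet below is a hedged sketch of flipping them on from code. Double-check the value names against the fuslogvw documentation for your framework version, and remember to turn logging back off afterwards since it slows every bind down.

using Microsoft.Win32;

class EnableFusionLogging
{
    static void Main()
    {
        // Requires elevation; these are the commonly documented Fusion logging switches.
        using (RegistryKey key = Registry.LocalMachine.CreateSubKey(@"SOFTWARE\Microsoft\Fusion"))
        {
            key.SetValue("EnableLog", 1, RegistryValueKind.DWord);     // turn logging on
            key.SetValue("LogFailures", 1, RegistryValueKind.DWord);   // log failed binds
            key.SetValue("ForceLog", 1, RegistryValueKind.DWord);      // log successful binds too
            key.SetValue("LogPath", @"C:\FusionLogs\", RegistryValueKind.String); // folder must exist
        }
    }
}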
The last trace I found in the viewer for the second wwReflection20 load (the one that is wonky) looks like this:

*** Assembly Binder Log Entry (1/13/2012 @ 3:06:49 AM) ***

The operation was successful.
Bind result: hr = 0x0. The operation completed successfully.
Assembly manager loaded from: C:\Windows\Microsoft.NET\Framework\V4.0.30319\clr.dll
Running under executable c:\programs\vfp9\vfp9.exe
--- A detailed error log follows.
=== Pre-bind state information ===
LOG: User = Ras\ricks
LOG: DisplayName = wwReflection20, Version=4.61.0.0, Culture=neutral, PublicKeyToken=null (Fully-specified)
LOG: Appbase = file:///C:/Programs/vfp9/
LOG: Initial PrivatePath = NULL
LOG: Dynamic Base = NULL
LOG: Cache Base = NULL
LOG: AppName = vfp9.exe
Calling assembly : (Unknown).
===
LOG: This bind starts in default load context.
LOG: Using application configuration file: C:\Programs\vfp9\vfp9.exe.Config
LOG: Using host configuration file:
LOG: Using machine configuration file from C:\Windows\Microsoft.NET\Framework\V4.0.30319\config\machine.config.
LOG: Policy not being applied to reference at this time (private, custom, partial, or location-based assembly bind).
LOG: Attempting download of new URL file:///C:/Programs/vfp9/wwReflection20.DLL.
LOG: Assembly download was successful. Attempting setup of file: C:\Programs\vfp9\wwReflection20.dll
LOG: Entering run-from-source setup phase.
LOG: Assembly Name is: wwReflection20, Version=4.61.0.0, Culture=neutral, PublicKeyToken=null
LOG: Binding succeeds. Returns assembly from C:\Programs\vfp9\wwReflection20.dll.
LOG: Assembly is loaded in default load context.
WRN: The same assembly was loaded into multiple contexts of an application domain:
WRN: Context: Default | Domain ID: 2 | Assembly Name: wwReflection20, Version=4.61.0.0, Culture=neutral, PublicKeyToken=null
WRN: Context: LoadFrom | Domain ID: 2 | Assembly Name: wwReflection20, Version=4.61.0.0, Culture=neutral, PublicKeyToken=null
WRN: This might lead to runtime failures.
WRN: It is recommended to inspect your application on whether this is intentional or not.
WRN: See whitepaper http://go.microsoft.com/fwlink/?LinkId=109270 for more information and common solutions to this issue.

Notice that the fusion log clearly shows that the .NET loader makes no attempt to even load the assembly from the path I explicitly specified.

Remember your Assembly Locations

As mentioned earlier, all failures I've seen like this ultimately resulted from different versions of the same type being available in the two AppDomains. At first sight that seems ridiculous - how could the types be different and why would you have multiple assemblies - but there are actually a number of scenarios where it's quite possible to have multiple copies of the same assembly floating around in multiple places. If you're hosting different environments (like hosting the Razor Engine, or the ASP.NET Runtime for example) it's common to create a private BIN folder, and it's important to make sure that there's no overlap of assemblies. In my case of Html Help Builder the problem started because I'm using COM interop to access the .NET assembly and the above code. COM interop has very specific requirements on where assemblies can be found, and because I was mucking around with the loader code today I ended up moving assemblies around to a new location for explicit loading. The explicit load works in the main AppDomain, but failed in the remote domain as I showed.
The solution here was simple enough: delete the extraneous assembly which was left around by accident. Not a common problem, but one that when it bites is pretty nasty to figure out, because it seems so unlikely that the types wouldn't match. I know I've run into this a few times, and writing this down will hopefully make me remember in the future rather than poking around again for an hour trying to debug the issue as I did today. Hopefully it'll save some of you some time as well.

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in .NET  COM

    Read the article

  • CodePlex Daily Summary for Monday, November 22, 2010

    CodePlex Daily Summary for Monday, November 22, 2010Popular ReleasesSQL Monitor: SQLMon 1.1: changes: 1.added sql job monitoring; 2.added settings save/loadASP.NET MVC Project Awesome (jQuery Ajax helpers): 1.3.1 and demos: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form and Pager tested on mozilla, safari, chrome, opera, ie 9b/8/7/6DotSpatial: DotSpatial 11-21-2010: This release introduces the following Fixed bugs related to dispose, which caused issues when reordering layers in the legend Fixed bugs related to assigning categories where NULL values are in the fields New fast-acting resize using a bitmap "prediction" of what the final resize content will look like. ImageData.ReadBlock, ImageData.WriteBlock These allow direct file access for reading or writing a rectangular window. Bitmaps are used for holding the values. Removed the need to stor...Minemapper - dynamic mapping for Windows: Minemapper v0.1.0: Pan by: dragging the mouse using the buttons Zoom by: scrolling the mouse wheel using the buttons using the slider Night support Biome support Skylight support Direction support: East West Height slicingMDownloader: MDownloader-0.15.24.6966: Fixed Updater; Fixed minor bugs;WPF Application Framework (WAF): WPF Application Framework (WAF) 2.0.0.1: Version: 2.0.0.1 (Milestone 1): This release contains the source code of the WPF Application Framework (WAF) and the sample applications. Requirements .NET Framework 4.0 (The package contains a solution file for Visual Studio 2010) The unit test projects require Visual Studio 2010 Professional Remark The sample applications are using Microsoft’s IoC container MEF. However, the WPF Application Framework (WAF) doesn’t force you to use the same IoC container in your application. You can use ...Smith Html Editor: Smith Html Editor V0.75: The first public release.MiniTwitter: 1.59: MiniTwitter 1.59 ???? ?? User Streams ????????????????? ?? ?????????????? ???????? ?????????????.NET Extensions - Extension Methods Library for C# and VB.NET: Release 2011.01: Added new extensions for - object.CountLoopsToNull Added new extensions for DateTime: - DateTime.IsWeekend - DateTime.AddWeeks Added new extensions for string: - string.Repeat - string.IsNumeric - string.ExtractDigits - string.ConcatWith - string.ToGuid - string.ToGuidSave Added new extensions for Exception: - Exception.GetOriginalException Added new extensions for Stream: - Stream.Write (overload) And other new methods ... Release as of dotnetpro 01/2011Code Sample from Microsoft: Visual Studio 2010 Code Samples 2010-11-19: Code samples for Visual Studio 2010Prism Training Kit: Prism Training Kit 4.0: Release NotesThis is an updated version of the Prism training Kit that targets Prism 4.0 and added labs for some of the new features of Prism 4.0. This release consists of a Training Kit with Labs on the following topics Modularity Dependency Injection Bootstrapper UI Composition Communication MEF Navigation Note: Take into account that this is a Beta version. If you find any bugs please report them in the Issue Tracker PrerequisitesVisual Studio 2010 Microsoft Word 2...Free language translator and file converter: Free Language Translator 2.2: Starting with version 2.0, the translator encountered a major redesign that uses MEF based plugins and .net 4.0. 
I've also fixed some bugs and added support for translating subtitles that can show up in video media players. Version 2.1 shows the context menu 'Translate' in Windows Explorer on right click. Version 2.2 has links to start the media file with its associated subtitle. Download the zip file and expand it in a temporary location on your local disk. At a minimum , you should uninstal...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.4 Released: Hi, Today we are releasing Visifire 3.6.4 with few bug fixes: * Multi-line Labels were getting clipped while exploding last DataPoint in Funnel and Pyramid chart. * ClosestPlotDistance property in Axis was not behaving as expected. * In DateTime Axis, Chart threw exception on mouse click over PlotArea if there were no DataPoints present in Chart. * ToolTip was not disappearing while changing the DataSource property of the DataSeries at real-time. * Chart threw exception ...Microsoft SQL Server Product Samples: Database: AdventureWorks 2008R2 SR1: Sample Databases for Microsoft SQL Server 2008R2 (SR1)This release is dedicated to the sample databases that ship for Microsoft SQL Server 2008R2. See Database Prerequisites for SQL Server 2008R2 for feature configurations required for installing the sample databases. See Installing SQL Server 2008R2 Databases for step by step installation instructions. The SR1 release contains minor bug fixes to the installer used to create the sample databases. There are no changes to the databases them...VidCoder: 0.7.2: Fixed duplicated subtitles when running multiple encodes off of the same title.Craig's Utility Library: Craig's Utility Library Code 2.0: This update contains a number of changes, added functionality, and bug fixes: Added transaction support to SQLHelper. Added linked/embedded resource ability to EmailSender. Updated List to take into account new functions. Added better support for MAC address in WMI classes. Fixed Parsing in Reflection class when dealing with sub classes. Fixed bug in SQLHelper when replacing the Command that is a select after doing a select. Fixed issue in SQL Server helper with regard to generati...MFCMAPI: November 2010 Release: Build: 6.0.0.1023 Full release notes at SGriffin's blog. If you just want to run the tool, get the executable. If you want to debug it, get the symbol file and the source. The 64 bit build will only work on a machine with Outlook 2010 64 bit installed. All other machines should use the 32 bit build, regardless of the operating system. Facebook BadgeDotNetNuke® Community Edition: 05.06.00: Major HighlightsAdded automatic portal alias creation for single portal installs Updated the file manager upload page to allow user to upload multiple files without returning to the file manager page. Fixed issue with Event Log Email Notifications. Fixed issue where Telerik HTML Editor was unable to upload files to secure or database folder. Fixed issue where registration page is not set correctly during an upgrade. 
Fixed issue where Sendmail stripped HTML and Links from emails...mVu Mobile Viewer: mVu Mobile Viewer 0.7.10.0: Tube8 fix.EPPlus-Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.8.0.1: EPPlus-Create advanced Excel 2007 spreadsheets on the serverNew Features Improved chart support Different chart-types series on the same chart Support for secondary axis and a lot of new properties Better styling Encryption and Workbook protection Table support Import csv files Array formulas ...and a lot of bugfixesNew Projects.NET 4 Workflow Activities for Citrix: .NET 4 based workflow activities targeting the Citrix infrastructure.Age calculator: It calculates the age of a person in days on specification of date of birth.Another Azure Demo Project: An Azure demo project - based on the one we (Johan Danforth and Dag König) showed on the Swedish Azure Summit.ASP.NET Layered Web Application: N-Layered Web Applications with ASP.NET based on the article by Imar Spaanjaars.Binzlog: Donet ????。Build Solution: Buid Visual Studio applications with .Net code.CondominioOnline: Projeto para o desenvolvimento colaborativo dos diagramas de desenvolvimento.Create Dynamic UI with WPF: Create Dynamic UI with WPFDNN Fanbox: dot net nuke plugin facebook fanboxDNN Tweet: DNN Tweet is a twitter plugin for DotnetNuke DotNetNuke Notes: dnnNotes allows you to create simple notes that are stored on your DotNetNuke site.Easy Login PHP Script: Give your site a professional looking Members Area with this completely FREE and easy-to-use PHP script! Developed in PHP and uses MySQL as a database backend. Go on, click here, you know you want to! :DFind Nigerian Traditional Fashion Styles: NaijaTradStyles is a social network for Nigerians all over the world to promote the Nigerian economy, designs and cultures, fashion designers and individuals. This site allows users to share fashion ideas, activities, events, and interests within their individual networks. The GreenArrow: Just a simple mark-locate-click automation tool by comparing graphic pieces. GreenArrow makes it easier for automation script writer to handle UI elements which cannot be located by normal methods, like keyword or classid. Libero API for Fusion Charts in ASP.Net: Libero.FusionChartsAPI is made for Asp.Net (Webforms and MVC) developers to make easier to implement Fusion Charts in their projects. It is developed in framework .Net 4 (but supports framework 3.5) to target ASP.Net projects. Minemapper - dynamic mapping for Windows: Minemapper is an interactive, dynamic mapper for Minecraft. It uses mcmap to generate small map image tiles, then lets you pan and zoom around, quickly generating new tiles as needed.MoodleAzure: Enable Moodle 1.9.9 to run on Windows Azure and SQL AzureOpalis Active Directory Extension: A Opalis Integration Pack Project for Active Directory Integration. Done with C# Directory Services.Quick Finger SDK: Quick Finger SDK helps you to build a wide range of applications to use fingerprint recognition. Quick Finger SDK makes it easier for developers to integrate fingerprint recognition into their software. It's developed in Visual C++. Regex Batch Replacer (Multi-File): Regex Batch Replacer uses regular expression to find and replace text in multiple files.RiverRaid X: A clone of the classic Atari 2600 arcade game, River Raid. 
Uses XNA 4.0 and Neat game engine (http://neat.codeplex.com)SharePoint Commander: SharePoint 2010 administrative tool for developers and administrators.StreamerMatch: A tool for streamers, focused at Starcraft II at the moment.Tab Web Part: This solution is used to present the WebParts in a tab like user interface. It is tested on a SharePoint 2010 sandboxed solution. With this solution, all the WebParts added in a particular zone will appear in a tab kind of interface in the design mode. The javascript transformsTomato: XNA-based rendering middleware.UnicornObjects: todoVina: VinaWPF Photo/Image Manager: A WPF playground for many projects, including an image viewer, filters, image modification, photo organization, etc.WXQCW: wxqcw news platformYobbo Guitar: Yobbo guitar is a web application developed in ASP.NET that allows users to share guitar songs and chord progressions.

    Read the article

  • Caveats with the runAllManagedModulesForAllRequests in IIS 7/8

    - by Rick Strahl
    One of the nice enhancements in IIS 7 (and now 8) is the ability to intercept non-managed - i.e. non-ASP.NET-served - requests from within managed ASP.NET modules. This opened up a ton of new functionality that can be applied across non-managed content using .NET code. I thought I had a pretty good handle on how IIS 7's integrated mode pipeline works, but when I put together some samples last night I realized that the way managed and unmanaged requests fire into the pipeline is downright confusing, especially when it comes to the runAllManagedModulesForAllRequests attribute. There are a number of settings that can affect whether a managed module receives non-ASP.NET content requests such as static files or requests from other frameworks like PHP or classic ASP, and that is the topic of this blog post.

Native and Managed Modules

The integrated mode IIS pipeline for IIS 7 and later - as the name suggests - allows for integration of ASP.NET pipeline events in the IIS request pipeline. Natively IIS runs unmanaged code, and there is a host of native modules that handle the core behavior of IIS. If you set up a new IIS site or application without managed code support, only the native modules are supported and fired, without any interaction between native and managed code. If you use the integrated pipeline with managed code enabled, however, things get a little more confusing, as both native modules and .NET managed modules can fire against the same IIS request. If you open up the IIS Modules dialog you see both managed and unmanaged modules: native modules point at physical files on disk, while managed modules point at .NET types and files referenced from the GAC or the current project's BIN folder. Both native and managed modules can co-exist and execute side by side on the same request.

When running in IIS 7 the IIS pipeline actually instantiates the ASP.NET runtime (via the System.Web.PipelineRuntime class), which unlike the core HttpRuntime classes in ASP.NET receives notification callbacks when IIS integrated mode events fire. The IIS pipeline is smart enough to detect whether managed handlers are attached, and if there are none these notifications don't fire, improving performance.

The good news about all of this for .NET devs is that ASP.NET-style modules can be used for just about every kind of IIS request. All you need to do is create a new web application, enable ASP.NET on it, and then attach managed handlers. Handlers can look at ASP.NET content (i.e. ASPX pages, MVC, WebAPI etc. requests) as well as non-ASP.NET content, including static content like HTML files, images, JavaScript and CSS resources. It's very cool that this capability has been surfaced. However, with that functionality comes a lot of responsibility: because every request passes through the ASP.NET pipeline if managed modules (or handlers) are attached, there are possible performance implications. Running through the ASP.NET pipeline does add some overhead.
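To make the module side of this concrete, here is a minimal sketch of a managed module (my own example, not one from the post) that runs in the integrated pipeline. Registered without a managedHandler precondition - or with runAllManagedModulesForAllRequests="true" - it will see every request IIS serves, static files included. The header name and the logic are purely illustrative.

using System.Web;

// Minimal managed module sketch: tags every response it sees with the request's FilePath.
public class RequestTaggingModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        // BeginRequest fires for any request routed through the managed pipeline -
        // .aspx pages, MVC routes, and (depending on the settings discussed below)
        // even .html, .css and image requests.
        app.BeginRequest += (sender, e) =>
        {
            var context = ((HttpApplication)sender).Context;
            context.Response.AppendHeader("X-Seen-By-Managed-Module", context.Request.FilePath);
        };
    }

    public void Dispose()
    {
    }
}

Registering a module like this is a one-line <add name=... type=... /> entry in the <modules> section shown next, which is exactly where the runAllManagedModulesForAllRequests behavior comes into play.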
    ASP.NET and Your Own Modules

When you create a new ASP.NET project, the Visual Studio templates typically create the modules section like this:

<system.webServer>
  <validation validateIntegratedModeConfiguration="false" />
  <modules runAllManagedModulesForAllRequests="true" >
  </modules>
</system.webServer>

The interesting thing here is the runAllManagedModulesForAllRequests="true" flag, which at first glance looks like the switch that controls whether the registered managed modules run for all requests. Realistically though, this flag does not control whether managed code is fired for all requests or not. Rather, it is an override for the preCondition flag on a particular module. With the flag set to the default of true, you can assume that pretty much every IIS request you receive ends up firing through your ASP.NET module pipeline, and every module you have configured is hit even by non-managed requests like static files. In other words, your module will have to handle all requests.

Now, so far so obvious. What's not quite so obvious is what happens when you set runAllManagedModulesForAllRequests="false". You would probably expect that non-ASP.NET requests immediately stop being funnelled through the ASP.NET module pipeline. But that's not what actually happens. For example, if I declare a module like this:

<add name="SharewareModule" type="HowAspNetWorks.SharewareMessageModule" />

by default it will fire against ALL requests regardless of the runAllManagedModulesForAllRequests flag. Even if runAllManagedModulesForAllRequests="false", the module is fired. Not quite what you'd expect.

So what is runAllManagedModulesForAllRequests really good for? It's essentially an override for the managedHandler preCondition. If I declare my module in web.config like this:

<add name="SharewareModule" type="HowAspNetWorks.SharewareMessageModule" preCondition="managedHandler" />

and runAllManagedModulesForAllRequests="false", my module only fires against managed requests. If I switch the flag to true, my module ends up handling all IIS requests that are passed through from IIS.

The moral of the story here is that if you intend to only look at ASP.NET content, you should always set the preCondition="managedHandler" attribute to ensure that only managed requests are fired on the module. But even if you do this, realize that runAllManagedModulesForAllRequests="true" can override this setting.

runAllManagedModulesForAllRequests and Http Application Events

Another place the runAllManagedModulesForAllRequests attribute has an effect is the global HttpApplication object (typically in global.asax) and the Application_XXXX events that you can hook up there. While those events are dynamically hooked up to the application class, they basically behave as if they were set with the preCondition="managedHandler" configuration switch. The end result is that with runAllManagedModulesForAllRequests="true" you'll see every HTTP request passed through the Application_XXXX events, and with the flag set to "false" you only see ASP.NET requests.
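A quick way to see this for yourself (my own throwaway experiment, not from the original post) is to drop a trace line into global.asax and flip the flag between runs - with "true" static file URLs show up in the output, with "false" only ASP.NET-handled requests do:

using System;
using System.Diagnostics;
using System.Web;

// Code-behind style global.asax sketch for the experiment described above.
public class Global : HttpApplication
{
    protected void Application_BeginRequest(object sender, EventArgs e)
    {
        // With runAllManagedModulesForAllRequests="true" this logs every request IIS
        // hands to the site (.html, .css, images, ...); with "false" only managed
        // requests (.aspx, MVC routes, handlers) appear.
        Debug.WriteLine("BeginRequest: " + Request.RawUrl);
    }
}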
What does all that mean? Configuring an application to handle requests for both ASP.NET and other content can be tricky, especially if you need to mix modules that might require both. A couple of things are important to remember. If your module doesn't need to look at every request, by all means set preCondition="managedHandler" on it. This will at least allow it to respond to the runAllManagedModulesForAllRequests="false" flag and then only process ASP.NET requests.

Look really carefully at whether you actually need runAllManagedModulesForAllRequests="true" in your applications, as set by the default new-project templates in Visual Studio. Part of the reason this is the default is that it was required for the initial versions of IIS 7 and ASP.NET 2 in order to handle MVC extensionless URLs. However, if you are running IIS 7 or later and .NET 4.0 you can use the ExtensionlessUrlHandler instead to get MVC functionality without requiring runAllManagedModulesForAllRequests="true":

<handlers>
  <remove name="ExtensionlessUrlHandler-Integrated-4.0" />
  <add name="ExtensionlessUrlHandler-Integrated-4.0"
       path="*."
       verb="GET,HEAD,POST,DEBUG,PUT,DELETE,PATCH,OPTIONS"
       type="System.Web.Handlers.TransferRequestHandler"
       preCondition="integratedMode,runtimeVersionv4.0" />
</handlers>

Oddly, this is the default for Visual Studio 2012 MVC template apps, so I'm not sure why the default template still adds runAllManagedModulesForAllRequests="true" - it should be enabled only if there's a specific need to access non-ASP.NET requests.

As a side note, it's interesting that when you access a static HTML resource, you can actually write into the Response object and get the output to show, which is trippy. I haven't looked closely to see how this works - whether ASP.NET just fires directly into the native output stream or whether the static requests are re-routed directly through the ASP.NET pipeline once a managed code module is detected. This doesn't work for all non-ASP.NET resources - for example, I can't do the same with classic ASP requests - but it makes for an interesting demo when injecting HTML content into a static HTML page :-)

Note that on the original Windows Server 2008 and Vista (IIS 7.0) you might need a hotfix in order for ExtensionlessUrlHandler to work properly for MVC projects. On my live server I needed it (about 6 months ago), but others have observed that the latest service updates have integrated this functionality and the hotfix is not required. On IIS 7.5 and later I've not needed any patches for things to just work.

Plan for non-ASP.NET Requests

It's important to remember that if you write a .NET module to run on IIS 7, there's no way for you to prevent non-ASP.NET requests from hitting your module. So make sure you plan to support requests to extensionless URLs and to static resources like files. Luckily, ASP.NET creates a full Request and a full Response object for you even for non-ASP.NET content. So even for static files, and even for classic ASP for example, you can look at Request.FilePath or Request.ContentType (in post-handler pipeline events) to determine what content you are dealing with.
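Here's a hedged sketch of that kind of check - again my own illustration, with the extension filter and the event choice as assumptions; the point is simply to identify non-ASP.NET content and bail out as early as possible:

using System;
using System.IO;
using System.Web;

// Module sketch that inspects the request and exits immediately for content it doesn't care about.
public class ContentFilteringModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.PostRequestHandlerExecute += OnPostRequestHandlerExecute;
    }

    private void OnPostRequestHandlerExecute(object sender, EventArgs e)
    {
        HttpContext context = ((HttpApplication)sender).Context;

        // Cheap filter first: only ASP.NET pages are interesting to this module.
        string extension = Path.GetExtension(context.Request.FilePath);
        if (!string.Equals(extension, ".aspx", StringComparison.OrdinalIgnoreCase))
            return;

        // ... actual module work for ASP.NET page requests goes here ...
    }

    public void Dispose()
    {
    }
}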
    As always with module design, check for the conditions in your code that make the module applicable, and if a filter fails, exit immediately - minimize the code that runs when your module doesn't need to process the request.

© Rick Strahl, West Wind Technologies, 2005-2012
Posted in IIS7  ASP.NET

    Read the article
