Search Results

Search found 13303 results on 533 pages for 'position fixed'.


  • Quick Quips on QR Codes

    - by Tim Dexter
    Yes, I'm an alliterating all-star; I missed my calling as a newspaper headline writer. I have recently received questions from several folks on support for QR codes. You know them, they are everywhere you look, even here! How does Publisher handle QR codes then? In theory, exactly the same way we handle any other 2D barcode font. We need the font file, a mapping entry and an encoding class. With those three pieces we can embed QR codes into any output.

To test the theory, I went off to IDAutomation. I have worked with them and many customers over the years and their fonts and encoders have worked great and have been very reliable. They kindly provide demo fonts, which has made my life so much easier when writing posts like this. Their QR font and encoder is a little tough to find. I started here and then hit the Demo Now button. On the next page I hit the right-hand Demo Now button. In the resulting zip file you'll need two files:

AdditionalFonts.zip >> Automation2DFonts >> TrueType >> IDAutomation2D.ttf
Java Class Encoder >> IDAutomation_JavaFontEncoder_QRCode.jar - the QRBarcodeExample.java is useful to see how to call the encoder.

The font file needs to be installed into the windows/fonts directory; just copy and paste it in using file explorer and Windows will install it for you. Remember, we are using the demo font here, and if you point your phone's decoder at the font above you'll see there is a fixed string 'DEMO' at the beginning. You want that removed? Go buy the font from the IDAutomation folks.

The Encoder
Next you need to create your encoding wrapper class. Publisher does ship a class, but it's compiled and I do not recommend trying to modify it; you can just build your own. I have loaded up my class here. You do not need to be a Java guru, it's pretty straightforward. I'd recommend a Java IDE like JDeveloper from a convenience point of view. I have annotated my class and added a main method to it so you can test your encoders from JDeveloper without having to deploy them first. You can load up the project from the zip file straight into JDeveloper.

Next, take a look at IDAutomation's example Java class and you'll see:

QRCodeEncoder qre = new QRCodeEncoder();
String DataToEncode = "IDAutmation Inc.";
boolean ApplyTilde = false;
int EncodingMode = 0;
int Version = 0;
int ErrorCorrectionLevel = 0;
System.out.println( qre.FontEncode(DataToEncode, ApplyTilde, EncodingMode, Version, ErrorCorrectionLevel) );

You'll need to check what settings you need to set for ApplyTilde, EncodingMode, Version and ErrorCorrectionLevel. They are covered in the user guide from IDAutomation here. If you do not want to hard-code the values in the encoder then you can quite easily externalize them and read the values from a text file. I have not covered that scenario here; I'm going with IDAutomation's defaults and my phone app is reading the fonts no problem.

Now you know how to call the encoder, you need to incorporate it into your encoder wrapper class. From my sample class:

Class[] clazz = new Class[] { "".getClass() };
ENCODERS.put("code128a", mUtility.getClass().getMethod("code128a", clazz));
ENCODERS.put("code128b", mUtility.getClass().getMethod("code128b", clazz));
ENCODERS.put("code128c", mUtility.getClass().getMethod("code128c", clazz));
ENCODERS.put("qrcode", mUtility.getClass().getMethod("qrcode", clazz));

I just added a new entry to register the encoder method 'qrcode' (the last line above).
Then I created a new method inside the class to call the IDAutomation encoder.

/** Call to IDAutomation's QR Code encoder. Passing the data to encode.
    Returning the encoded string to the template for formatting. **/
public static final String qrcode (String DataToEncode) {
  QRCodeEncoder qre = new QRCodeEncoder();
  boolean ApplyTilde = false;
  int EncodingMode = 0;
  int Version = 0;
  int ErrorCorrectionLevel = 0;
  return qre.FontEncode(DataToEncode, ApplyTilde, EncodingMode, Version, ErrorCorrectionLevel);
}

It is almost exactly the same code as in their sample class; the DataToEncode string is passed in rather than hardcoded, of course.

With the class done you can now compile it, but you need to ensure that the IDAutomation_JavaFontEncoder_QRCode.jar is in the classpath. In JDeveloper, open the project properties >> Libraries and Classpaths and then add the jar to the list. You'll need the Publisher jars too; you can find those in the jlib directory in your Template Builder for Word directory.

Note! In my class I have used

package oracle.psbi.barcode;

as my package spec. Yours will be different, but you need to note it for later.

Once you have it compiling without errors you will need to generate a jar file to keep it in. In JDeveloper highlight your project node >> New >> Deployment Profile >> JAR file. Once you have created the descriptor, just take the defaults. It will tell you where the jar is located. Go get it and then it's time to copy it and the IDAutomation jar into the Template Builder for Word directory structure.

Deploying the jars
On your Windows machine locate the jlib directory under the Template Builder for Word install directory. On my machine it's here: F:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\jlib. Copy both of the jar files into the directory.

The next step is to get the jars into the classpath for the Word plugin so that Publisher can find your wrapper class and it can then find the IDAutomation encoder. The most consistent way I have found so far is to open up the RTF2PDF.jar in the same directory and make some mods. First make a backup of the jar file, then open it using winzip or 7zip or similar and get into the META-INF directory. In there is a file, MANIFEST.MF. This contains the classpath for the plugin; open it in an editor and add the jars to the end of the classpath list. In mine I have:

Manifest-Version: 1.0
Class-Path: ./activation.jar ./mail.jar ./xdochartstyles.jar ./bicmn.jar ./jewt4.jar ./share.jar ./bipres.jar ./xdoparser.jar ./xdocore.jar ./xmlparserv2.jar ./xmlparserv2-904.jar ./i18nAPI_v3.jar ./versioninfo.jar ./barcodejar.jar ./IDAutomation_JavaFontEncoder_QRCode.jar
Main-Class: RTF2PDF

I have put in carriage returns above to make the Class-Path: entry more readable; make sure yours is all on one line. Be sure to use the ./ as a prefix to the jar name. Ensure the file is saved inside the jar file; 7zip and winzip both have popups asking if you want to update the file in the jar file. Now that you have the jars on the classpath, the Publisher plugin will be able to find our classes at run time.

Referencing the Font
The next step is to reference the font location so that the rendering engine can find it and embed a subset into the PDF output. Remember, the other output formats rely on the font being present on the machine that is opening the document; the PDF is the only truly portable format.
Inside the config directory under the Template Builder for Word install directory (mine is here: F:\Program Files\Oracle\BI Publisher\BI Publisher Desktop\Template Builder for Word\config) you'll find the file 'xdo example.cfg'. Rename it to xdo.cfg and open it in a text editor. In the fonts section, create a new entry:

<font family="IDAutomation2D" style="normal" weight="normal">
       <truetype path="C:\windows\fonts\IDAutomation2D.ttf" />
</font>

Note, the family name 'IDAutomation2D' is the same name as you can see when you open MSWord and look for the QRCode font. This must match exactly. When Publisher looks at the fonts in the RTF template at runtime it will see 'IDAutomation2D'; it will then look at its font mapping entries to find where that font file resides on the disk. If the names do not match or the font is not present then the font will not get used and it will fall back on Helvetica.

Building the Template
Now you have the data encoder and the font in place and mapped, you can use it in the template. The two commands you will need to have present are:

<?register-barcode-vendor:'ENCODER WRAPPER CLASS'; 'ENCODER NAME'?>

For my encoder I have:

<?register-barcode-vendor:'oracle.psbi.barcode.BarcodeUtil'; 'MyBarcodeEncoder'?>

Notice the two parameters for the command. The first provides the package 'path' and class name (remember I said you need to remember that above). The second is the name of the encoder, in my case 'MyBarcodeEncoder'. Check my full encoder class in the zip linked below to see where I named it. You can change it to something else, no problem. This command needs to be near the top of the template.

The second command is the encoding command:

<?format-barcode:DATA_TO_ENCODE;'ENCODER_METHOD_NAME';'ENCODER_NAME'?>

For my command I have:

<?format-barcode:DATATEXT;'qrcode';'MyBarcodeEncoder'?>

DATATEXT is the XML element that contains the text to be encoded. If you want to hard-code a piece of text just surround it with single quotes. qrcode is the name of my encoder method that calls the IDAutomation encoder (remember this). MyBarcodeEncoder is the name of my encoder. Repetition? Yes, but it's needed again. Both of these commands are put inside their own form fields. Do not apply the QRCode font to the second field just yet; let's make sure the encoder is working first. Run your template with some data and you should get something like this for your encoded data:

AHEEEHAPPJOPMOFADIPFJKDCLPAHEEEHA
BNFFFNBPJGMDIDJPFOJGIGBLMPBNFFFNB
APIBOHFJCFBNKHGGBMPFJFJLJBKGOMNII
OANKPJFFLEPLDNPCLMNGNIJIHFDNLJFEH
FPLFLHFHFILKFBLOIGMDFCFLGJGOPJJME
CPIACDFJPBGDODOJCHALJOBPECKMOEDDF
MFFNFNEPKKKCHAIHCHPCFFLDAHFHAGLMK
APBBBPAPLDKNKJKKGIPDLKGMGHDDEPHLN
HHHHHHHPHPHHPHPPHPPPPHHPHHPHPHPHP

Grooovy huh? If you do not get the encoded text then go back and check that your jars are in the right spot and that you have the MANIFEST.MF file updated correctly. Once you do get the encoded text, highlight the field and apply the IDAutomation2D font to it. Then re-run the report and you will hopefully see the QR code in your output. If not, go back and check the xdo.cfg entry and make sure it's in the right place and the font location is correct.

That's it, you now have QR codes in Publisher outputs. Everything I have written above has been tested with the 5.6.3 and 10.1.3.4.2 codelines. I'll be testing the 11g code in the next day or two and will update you with any changes. One thing I have not covered yet, and will do in the next few days, is how to deploy all of this to your server.
Look out for a follow up post. One note on the apparent white lines in the font (see the image above). Once printed they disappear and even viewing the code on a screen with the white lines, my phone app is still able to read and interpret the contents no problem. I have zipped up my encoder wrapper class as a JDeveloper 11.1.1.6 project here. Just dig into the src directories to find the BarcodeUtil.java file if you just want the code. I have put comments into the file to hopefully help the novice java programmer out. Happy QR'ing!
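
If you just want a feel for the overall shape of that class before opening the zip, here is a minimal sketch pieced together from the fragments above. It is only a sketch: the IDAutomation package name in the import is an assumption (check the jar you downloaded), and the real BarcodeUtil.java also registers the code128 methods and implements Publisher's barcode encoder interface.

package oracle.psbi.barcode;

import java.util.Hashtable;

// Package name of the demo encoder is an assumption - verify it against the
// classes inside IDAutomation_JavaFontEncoder_QRCode.jar.
import com.idautomation.fontencoder.qrcode.QRCodeEncoder;

public class BarcodeUtil {

    private static final BarcodeUtil mUtility = new BarcodeUtil();
    private static final Hashtable ENCODERS = new Hashtable(10);

    static {
        try {
            Class[] clazz = new Class[] { "".getClass() };
            // Register the method name that the <?format-barcode:...?> command refers to.
            ENCODERS.put("qrcode", mUtility.getClass().getMethod("qrcode", clazz));
        } catch (NoSuchMethodException e) {
            e.printStackTrace();
        }
    }

    /** Calls IDAutomation's QR Code encoder with the data to encode and
        returns the encoded string to the template for formatting. */
    public static final String qrcode(String DataToEncode) {
        QRCodeEncoder qre = new QRCodeEncoder();
        boolean ApplyTilde = false;
        int EncodingMode = 0;
        int Version = 0;
        int ErrorCorrectionLevel = 0;
        return qre.FontEncode(DataToEncode, ApplyTilde, EncodingMode, Version, ErrorCorrectionLevel);
    }

    /** Small test harness so the encoder can be run straight from the IDE
        without deploying the jars to the Template Builder first. */
    public static void main(String[] args) {
        System.out.println(qrcode("http://www.example.com/test"));
    }
}

The 'qrcode' key registered here is the ENCODER_METHOD_NAME used by the format-barcode command, and the package plus class name is what goes into register-barcode-vendor.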


  • Collide with rotation of the object

    - by Lahiru
    I'm developing a mirror for a laser beam (Ball sprite). I'm trying to redirect the laser beam according to the rotation angle of the mirror (Rectangle). How can I make the ball bounce off at the correct angle when the colliding object is rotated (e.g. 45 deg), rather than just bouncing straight back? Here is a screenshot of my work, and here is my code:

using System;
using System.Collections.Generic;
using System.Linq;
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Audio;
using Microsoft.Xna.Framework.Content;
using Microsoft.Xna.Framework.GamerServices;
using Microsoft.Xna.Framework.Graphics;
using Microsoft.Xna.Framework.Input;
using Microsoft.Xna.Framework.Media;

namespace collision
{
    /// <summary>
    /// This is the main type for your game
    /// </summary>
    public class Game1 : Microsoft.Xna.Framework.Game
    {
        GraphicsDeviceManager graphics;
        SpriteBatch spriteBatch;

        Texture2D ballTexture;
        Rectangle ballBounds;
        Vector2 ballPosition;
        Vector2 ballVelocity;
        float ballSpeed = 30f;

        Texture2D blockTexture;
        Rectangle blockBounds;
        Vector2 blockPosition;
        private Vector2 origin;

        KeyboardState keyboardState;

        //Font
        SpriteFont Font1;
        Vector2 FontPos;
        private String displayText;

        public Game1()
        {
            graphics = new GraphicsDeviceManager(this);
            Content.RootDirectory = "Content";
        }

        /// <summary>
        /// Allows the game to perform any initialization it needs to before starting to run.
        /// This is where it can query for any required services and load any non-graphic
        /// related content. Calling base.Initialize will enumerate through any components
        /// and initialize them as well.
        /// </summary>
        protected override void Initialize()
        {
            // TODO: Add your initialization logic here
            ballPosition = new Vector2(this.GraphicsDevice.Viewport.Width / 2, this.GraphicsDevice.Viewport.Height * 0.25f);
            blockPosition = new Vector2(this.GraphicsDevice.Viewport.Width / 2, this.GraphicsDevice.Viewport.Height / 2);
            ballVelocity = new Vector2(0, 1);

            base.Initialize();
        }

        /// <summary>
        /// LoadContent will be called once per game and is the place to load
        /// all of your content.
        /// </summary>
        protected override void LoadContent()
        {
            // Create a new SpriteBatch, which can be used to draw textures.
            spriteBatch = new SpriteBatch(GraphicsDevice);

            ballTexture = Content.Load<Texture2D>("ball");
            blockTexture = Content.Load<Texture2D>("mirror");

            //create rectangles based off the size of the textures
            ballBounds = new Rectangle((int)(ballPosition.X - ballTexture.Width / 2), (int)(ballPosition.Y - ballTexture.Height / 2), ballTexture.Width, ballTexture.Height);
            blockBounds = new Rectangle((int)(blockPosition.X - blockTexture.Width / 2), (int)(blockPosition.Y - blockTexture.Height / 2), blockTexture.Width, blockTexture.Height);

            origin.X = blockTexture.Width / 2;
            origin.Y = blockTexture.Height / 2;

            // TODO: use this.Content to load your game content here
            Font1 = Content.Load<SpriteFont>("SpriteFont1");
            FontPos = new Vector2(graphics.GraphicsDevice.Viewport.Width - 100, 20);
        }

        /// <summary>
        /// UnloadContent will be called once per game and is the place to unload
        /// all content.
        /// </summary>
        protected override void UnloadContent()
        {
            // TODO: Unload any non ContentManager content here
        }

        /// <summary>
        /// Allows the game to run logic such as updating the world,
        /// checking for collisions, gathering input, and playing audio.
        /// </summary>
        /// <param name="gameTime">Provides a snapshot of timing values.</param>
        private float RotationAngle;
        float circle = MathHelper.Pi * 2;
        float angle;

        protected override void Update(GameTime gameTime)
        {
            // Allows the game to exit
            if (GamePad.GetState(PlayerIndex.One).Buttons.Back == ButtonState.Pressed)
                this.Exit();

            // TODO: Add your update logic here
            //check for collision between the ball and the block, or if the ball is outside the bounds of the screen
            if (ballBounds.Intersects(blockBounds) || !GraphicsDevice.Viewport.Bounds.Contains(ballBounds))
            {
                //we have a simple collision!
                //if it has hit, swap the direction of the ball, and update its position
                ballVelocity = -ballVelocity;
                ballPosition += ballVelocity * ballSpeed;
            }
            else
            {
                //move the ball a bit
                ballPosition += ballVelocity * ballSpeed;
            }

            //update bounding boxes
            ballBounds.X = (int)ballPosition.X;
            ballBounds.Y = (int)ballPosition.Y;
            blockBounds.X = (int)blockPosition.X;
            blockBounds.Y = (int)blockPosition.Y;

            keyboardState = Keyboard.GetState();
            float val = 1.568017f / 90;

            if (keyboardState.IsKeyDown(Keys.Space))
                RotationAngle = RotationAngle + (float)Math.PI;
            if (keyboardState.IsKeyDown(Keys.Left))
                RotationAngle = RotationAngle - val;

            angle = (float)Math.PI / 4.0f; // 45 degrees
            RotationAngle = angle;
            // RotationAngle = RotationAngle % circle;
            displayText = RotationAngle.ToString();

            base.Update(gameTime);
        }

        /// <summary>
        /// This is called when the game should draw itself.
        /// </summary>
        /// <param name="gameTime">Provides a snapshot of timing values.</param>
        protected override void Draw(GameTime gameTime)
        {
            GraphicsDevice.Clear(Color.CornflowerBlue);

            // TODO: Add your drawing code here
            spriteBatch.Begin();

            // Find the center of the string
            Vector2 FontOrigin = Font1.MeasureString(displayText) / 2;
            spriteBatch.DrawString(Font1, displayText, FontPos, Color.White, 0, FontOrigin, 1.0f, SpriteEffects.None, 0.5f);

            spriteBatch.Draw(ballTexture, ballPosition, Color.White);
            spriteBatch.Draw(blockTexture, blockPosition, null, Color.White, RotationAngle, origin, 1.0f, SpriteEffects.None, 0f);

            spriteBatch.End();

            base.Draw(gameTime);
        }
    }
}
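
One common way to get the deflection asked about here is to stop negating the velocity and instead reflect it about the mirror's surface normal, which can be derived from RotationAngle. Below is only a rough sketch of what the collision branch in Update could look like, assuming the mirror's unrotated face points straight up; it uses the standard reflection v' = v - 2(v·n)n, which XNA exposes as Vector2.Reflect.

if (ballBounds.Intersects(blockBounds))
{
    // Surface normal of the mirror: take the "up" direction of the unrotated
    // sprite, (0, -1) in screen coordinates, and rotate it by RotationAngle.
    Vector2 normal = new Vector2((float)Math.Sin(RotationAngle), -(float)Math.Cos(RotationAngle));
    normal.Normalize();

    // Only bounce when the ball is actually travelling into the mirror face.
    if (Vector2.Dot(ballVelocity, normal) < 0)
    {
        // Standard reflection: v' = v - 2 * (v . n) * n
        ballVelocity = Vector2.Reflect(ballVelocity, normal);
    }
}
ballPosition += ballVelocity * ballSpeed;

With RotationAngle at MathHelper.PiOver4 a beam coming straight down leaves horizontally instead of doubling back. Note that Rectangle.Intersects is still an axis-aligned test, so for a tight fit against the rotated sprite you would also need a rotated (OBB or per-pixel) collision check.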


  • CodePlex Daily Summary for Wednesday, April 04, 2012

    CodePlex Daily Summary for Wednesday, April 04, 2012Popular ReleasesExtAspNet: ExtAspNet v3.1.2: ExtAspNet - ?? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ?????????? ExtAspNet ????? ExtJS ??? ASP.NET 2.0 ???,????? AJAX ??????????。 ExtAspNet ??????? JavaScript,?? CSS,?? UpdatePanel,?? ViewState,?? WebServices ???????。 ??????: IE 7.0, Firefox 3.6, Chrome 3.0, Opera 10.5, Safari 3.0+ ????:Apache License 2.0 (Apache) ??:http://extasp.net/ ??:http://bbs.extasp.net/ ??:http://extaspnet.codeplex.com/ ??:http://sanshi.cnblogs.com/ ????: +2012-04-04 v3.1.2 -??IE?????????????BUG(??"about:blank"?...totalem: version 2012.04.04.1: devel releaseNShape - .Net Diagramming Framework for Industrial Applications: NShape 2.0.0: Changes in 2.0.0:New / Changed / Extended Interfaces: public interface ISecurityManager public interface ISecurityDomainObject public interface IModelObject : ISecurityDomainObject public interface ILayerCollection : ICollection<Layer> public interface IRepository public interface ICommand public interface IModelMapping For details on interface changes, see documentation. For more changes on namespaces and base classes, see ReadMe.txt (installation folder) or Changes.txt in the s...ClosedXML - The easy way to OpenXML: ClosedXML 0.65.1: Aside from many bug fixes we now have Conditional Formatting The conditional formatting was sponsored by http://www.bewing.nl (big thanks) New on v0.65.1 Fixed issue when loading conditional formatting with default values for icon setsnopCommerce. Open source shopping cart (ASP.NET MVC): nopcommerce 2.50: Highlight features & improvements: • Significant performance optimization. • Allow store owners to create several shipments per order. Added a new shipping status: “Partially shipped”. • Pre-order support added. Enables your customers to place a Pre-Order and pay for the item in advance. Displays “Pre-order” button instead of “Buy Now” on the appropriate pages. Makes it possible for customer to buy available goods and Pre-Order items during one session. It can be managed on a product variant ...SkyDrive Connector for SharePoint: SkyDriveConnector for SharePoint: SkyDriveConnector for SharePointWiX Toolset: WiX v3.6 RC0: WiX v3.6 RC0 (3.6.2803.0) provides support for VS11 and a more stable Burn engine. For more information see Rob's blog post about the release: http://robmensching.com/blog/posts/2012/4/3/WiX-v3.6-Release-Candidate-Zero-availableSageFrame: SageFrame 2.0: Sageframe is an open source ASP.NET web development framework developed using ASP.NET 3.5 with service pack 1 (sp1) technology. 
It is designed specifically to help developers build dynamic website by providing core functionality common to most web applications.Lightbox Gallery Module for DotNetNuke: 01.11.00: This version of the Lightbox Gallery Module adds the following features/bug fixes: Feature: Read image EXIF information Bug Fix: URL parsing bug for any folder provider Minimum System Requirements Please ensure that your environment meets or exceeds the following system requirements: DotNetNuke® Version 06.00.00 or higher SQL Server 2005 or later .Net Framework 3.5 SP1 or lateriTuner - The iTunes Companion: iTuner 1.5.4475: Fix to parse empty playlists in iTunes LibraryGoogleMap Control: Google Geocoder 1.3: Geo types changes to string type and GeoAddressType enum changes to static class with some known types as constants in order to resolve the problems with new and not defined geo types.Document.Editor: 2012.2: Whats New for Document.Editor 2012.2: New Save Copy support New Page Setup support Minor Bug Fix's, improvements and speed upsVidCoder: 1.3.2: Added option for the minimum title length to scan. Added support to enable or disable LibDVDNav. Added option to prompt to delete source files after clearing successful completed items. Added option to disable remembering recent files and folders. Tweaked number box to only select all on a quick click.MJP's DirectX 11 Samples: Light Indexed Deferred Rendering: Implements light indexed deferred using per-tile light lists calculated in a compute shader, as well as a traditional deferred renderer that uses a compute shader for per-tile light culling and per-pixel shading.Pcap.Net: Pcap.Net 0.9.0 (66492): Pcap.Net - March 2012 Release Pcap.Net is a .NET wrapper for WinPcap written in C++/CLI and C#. It Features almost all WinPcap features and includes a packet interpretation framework. Version 0.9.0 (Change Set 66492)March 31, 2012 release of the Pcap.Net framework. Follow Pcap.Net on Google+Follow Pcap.Net on Google+ Files Pcap.Net.DevelopersPack.0.9.0.66492.zip - Includes all the tutorial example projects source files, the binaries in a 3rdParty directory and the documentation. It include...Extended WPF Toolkit: Extended WPF Toolkit - 1.6.0: Want an easier way to install the Extended WPF Toolkit?The Extended WPF Toolkit is available on Nuget. What's in the 1.6.0 Release?BusyIndicator ButtonSpinner Calculator CalculatorUpDown CheckListBox - Breaking Changes CheckComboBox - New Control ChildWindow CollectionEditor CollectionEditorDialog ColorCanvas ColorPicker DateTimePicker DateTimeUpDown DecimalUpDown DoubleUpDown DropDownButton IntegerUpDown Magnifier MaskedTextBox MessageBox MultiLineTex...Media Companion: MC 3.434b Release: General This release should be the last beta for 3.4xx. If there are no major problems, by the end of the week it will upgraded to 3.500 Stable! The latest mc_com.exe should be included too! TV Bug fix - crash when using XBMC scraper for TV episodes. Bug fix - episode count update when adding new episodes. Bug fix - crash when actors name was missing. Enhanced TV scrape progress text. Enhancements made to missing episodes display. 
Movies Bug fix - hide "Play Trailer" when multisaev...Better Explorer: Better Explorer 2.0.0.831 Alpha: - A new release with: - many bugfixes - changed icon - added code for more failsafe registry usage on x64 systems - not needed regfix anymore - added ribbon shortcut keys - Other fixes Note: If you have problems opening system libraries, a suggestion was given to copy all of these libraries and then delete the originals. Thanks to Gaugamela for that! (see discussion here: 349015 ) Note2: I was upload again the setup due to missing file!MonoGame - Write Once, Play Everywhere: MonoGame 2.5: The MonoGame team are pleased to announce that MonoGame v2.5 has been released. This release contains important bug fixes, implements optimisations and adds key features. MonoGame now has the capability to use OpenGLES 2.0 on Android and iOS devices, meaning it now supports custom shaders across mobile and desktop platforms. Also included in this release are native orientation animations on iOS devices and better Orientation support for Android. There have also been a lot of bug fixes since t...Circuit Diagram: Circuit Diagram 2.0 Alpha 3: New in this release: Added components: Microcontroller Demultiplexer Flip & rotate components Open XML files from older versions of Circuit Diagram Text formatting for components New CDDX syntax Other fixesNew ProjectsASP.Net MVC Composite Application Framework: ASP.NET MVC Composite Application FrameworkaTimer : Take a break timer: This timer will help programmers and other people who work on computer for long periods, users will be able to set their time period for work and break, whether you want to enforce the break by hibernating or making your computer sleep. Will be good for eyes, a lot! Still under development and will be monthly revised.Avilander Maven: Projeto Disciplina Branquinho.Banner from Databases: make a banner from database informationCrmXpress OptionSet Manager For Microsoft Dynamics CRM 2011: CrmXpress OptionSet Manager For Microsoft Dynamics CRM 2011CSharpCodeLineCounter: Count code lines as Code Metrics do.Decatec Sharepoint code: Various Decatec Sharepoint codeDevWebServer: DevWebServer is a simple, lightweight web server which can be used for unit testing HTTP client applications. It is developed in C#, has minimal dependencies and is provided as a single assembly.Effect Architect: A real-time effect editor (currently) aimed at Direct3D9. The idea is to bring features that popular IDEs provide to HLSL coding, as well as features that are specific to graphical coding.Gpu Hough: Demo project performing Hough transformation on GPUHappyHourWin8: This project contains several Metro-App development approaches. There will be an RSS-Reader, a task list and a game. Home Media Center: Home Media Center is a server application for UPnP / DLNA compatible devices. It supports streaming and transcoding media files, Windows desktop and video from webcams. This project is developed in C#, C++ and uses DirectShow, Media Foundation.INSTEON Mayhem Module: INSTEON module for Mayhem application.ISService: ISServiceKEITH: Kinect for hospitalsKnowledge Exchange: KnowledgeExchangeLightweight Test Automation Framework: The Lightweight Test Automation Framework for ASP.NET was developed and is currently used by the ASP.NET QA Team to automate regression tests for the product. It is designed to run within an ASP.NET application. Tests can be written in any .NET Framework language. 
They use an APlinewatchdk: linewatchdkLuaForMono: c#?lua?????,??LuaInterface??luaSharp???,?????????(luaInterface)??lua??c#?????????(luasharp),??mono?linux????????lua??.Metro App Toolkit: The Metro App Toolkit provides the developer community with key components and functionality that are commonly used in Metro Apps for WindowsPhone, Windows8 and the web, including Networking! Toolkit releases include samples & docs. From the community, for the community. My IE Accelerators: 1. dict.cn ie accelaratorofm: ofmOrchard Clippy module: Orchard Clippy moduleOrchard Twitter module: Orchard Twitter modulePersian Subtitle For Programming Courses: This project is created for collecting all programming course's PERSIAN SUBTITLE and help Persian language student to learn programming and every thing related to their major, by Meysam Hooshmandpicking: pickingPomodoro Timer for Visual Studio: Keep track of your Pomodoro Technique (from the Pomodoro Technique® official website – www.pomodorotechnique.com) sessions in Visual Studio.ScriptSafe: A simple powershell wrapper around source control aimed at admins to enable simple source control for their scripts.SkyDrive Connector for SharePoint: SkyDrive Connector for SharePoint is a sandbox web part that can help manage documents stored on skydrive on sharepoint by adding additional metadata. This helps the document to be discoverable on SharePoint. Also SkyDrive provides 25GB for free space for live users.SkypeTranslator: Help users to translate incoming message from Skype and Outlook applications to specified language. In core was use Elysium , log4net , and WPF frameworks, WPF Toolkit Extended,TPL,WCFTalking Heads: The TalkingHeads is a simple social network written in ASP .NET 4.0. It is intended for demonstrating and exploring application monitoring features of AVIcode technology. It works great with AVIcode 5.7, as well as with APM feature in System Center 2012. Unik Project: Unik is a Unified Framework makes easier to create enterprise .NET applications based on architecture best practice.WV-CU950 System Controller .NET Library.: WV-CU950 System Controller .NET library for Panasonic WV-CU950 System Controller. This library contains data processing method.YO JACK!: <Project name>Yo Jack!</Project name> Gather evidence to track down a physical theft of your computer, and turn over to the police. Free to the public.???????: ????????


  • Handling HumanTask attachments in Oracle BPM 11g PS4FP+ (II)

    - by ccasares
    Retrieving uploaded attachments -UCM-

As stated in my previous blog entry, Oracle BPM 11g 11.1.1.5.1 (aka PS4FP) introduced a cool new feature whereby you can use Oracle WebCenter Content (previously known as Oracle UCM) as the repository for the human task attached documents. For more information about how to use or enable this feature, have a look here. The attachment scope (either TASK or PROCESS) also applies to UCM attachments.

But even with this feature, one question might arise when using UCM attachments: how can I get them from within the process? The first answer would be to use the same getTaskAttachmentContents() XPath function already explained in my previous blog entry. In fact, that's the way it should be. But in Oracle BPM 11g 11.1.1.5.1 (PS4FP) and 11.1.1.6.0 (PS5) there's a bug that prevents you from doing that: if you invoke that function against a UCM attachment, you'll get a null content response (bug#13907552), even if the attachment was correctly uploaded. Until this bug gets fixed, I will show a workaround that lets me retrieve the UCM-attached documents from within a BPM process. Besides, the sample will show how to interact with the WCC API from within a BPM process.

Side note: I suggest you read my previous blog entry about Human Task attachments, where I briefly describe some concepts that are used next, such as the execData/attachment[] structure.

Sample Process
I will be using the following sample process: a dummy UserTask using the "HumanTask2" Human Task, followed by an Embedded Subprocess that will retrieve the attachments' payload. In this case, and here's the key point of the sample, we will retrieve such payload using the WebCenter Content WebService API (IDC), and once retrieved, we will write each attachment back to a file on the server using a File Adapter service.

In detail: we will use the same attachmentCollection XSD structure and the same BusinessObject definition as in the previous blog entry. However, we create a separate variable, named attachmentUCM, based on that BusinessObject. We will still need to keep a copy of the HumanTask output's execData structure, so we need to create a new variable of type TaskExecutionData (a different one from the one used for non-UCM attachments). As in the non-UCM attachments flow, in the output tab of the UserTask mapping we'll keep a copy of the execData structure.

Now we get into the embedded subprocess that will retrieve the attachments' payload. First, using an XSLT transformation, we feed the attachmentUCM variable with the following information: the name of each attachment (from the execData/attachment/name element), and the WebCenter Content ID of the uploaded attachment. This info is stored in the execData/attachment/URI element with the format ecm://<id>. As we just want the numeric <id>, we need to get rid of the protocol prefix ("ecm://"). We do so with some XPath functions as detailed below, with these two functions being invoked, respectively. We, again, set the target payload element to an empty string, to get the <payload></payload> tag created. The complete XSLT transformation is shown below; remember that we're using the XSLT for-each node to create as many target structures as necessary.

Once we have fed the attachmentsUCM structure, so that it now contains the name of each of the attachments along with each WCC unique id (dID), it is time to iterate through it and get the payload.
Therefore we will use a new embedded subprocess of type MultiInstance, that will iterate over the attachmentsUCM/attachment[] element: In each iteration we will use a Service activity that invokes WCC API through a WebService. Follow these steps to create and configure the Partner Link needed: Login to WCC console with an administrator user (i.e. weblogic). Go to Administration menu and click on "Soap Wsdls" link. We will use the GetFile service to retrieve a file based on its dID. Thus we'll need such service WSDL definition that can be downloaded by clicking the GetFile link. Save the WSDL file in your JDev project folder. In the BPM project's composite view, drag & drop a WebService adapter to create a new External Reference, based on the just added GetFile.wsdl. Name it UCM_GetFile. WCC services are secured through basic HTTP authentication. Therefore we need to enable the just created reference for that: Right-click the reference and click on Configure WS Policies. Under the Security section, click "+" to add the "oracle/wss_username_token_client_policy" policy The last step is to set the credentials for the security policy. For the sample we will use the admin user for WCC (weblogic/welcome1). Open the composite.xml file and select the Source view. Search for the UCM_GetFile entry and add the following highlighted elements into it:   <reference name="UCM_GetFile" ui:wsdlLocation="GetFile.wsdl">     <interface.wsdl interface="http://www.stellent.com/GetFile/#wsdl.interface(GetFileSoap)"/>     <binding.ws port="http://www.stellent.com/GetFile/#wsdl.endpoint(GetFile/GetFileSoap)"                 location="GetFile.wsdl" soapVersion="1.1">       <wsp:PolicyReference URI="oracle/wss_username_token_client_policy"                            orawsp:category="security" orawsp:status="enabled"/>       <property name="weblogic.wsee.wsat.transaction.flowOption"                 type="xs:string" many="false">WSDLDriven</property>       <property name="oracle.webservices.auth.username"                 type="xs:string">weblogic</property>       <property name="oracle.webservices.auth.password"                 type="xs:string">welcome1</property>     </binding.ws>   </reference> Now the new external reference is ready: Once the reference has just been created, we should be able now to use it from our BPM process. However we find here a problem. The WCC GetFile service operation that we will use, GetFileByID, accepts as input a structure similar to this one, where all element tags are optional: <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/">    <get:dID>?</get:dID>   <get:rendition>?</get:rendition>   <get:extraProps>      <get:property>         <get:name>?</get:name>         <get:value>?</get:value>      </get:property>   </get:extraProps></get:GetFileByID> and we need to fill up just the <get:dID> tag element. Due to some kind of restriction or bug on WCC, the rest of the tag elements must NOT be sent, not even empty (i.e.: <get:rendition></get:rendition> or <get:rendition/>). 
A sample request that performs the query just by the dID, must be in the following format: <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/">   <get:dID>12345</get:dID></get:GetFileByID> The issue here is that the simple mapping in BPM does create empty tags being a sample result as follows: <get:GetFileByID xmlns:get="http://www.stellent.com/GetFile/"> <get:dID>12345</get:dID> <get:rendition/> <get:extraProps/> </get:GetFileByID> Although the above structure is perfectly valid, it is not accepted by WCC. Therefore, we need to bypass the problem. The workaround we use (many others are available) is to add a Mediator component between the BPM process and the Service that simply copies the input structure from BPM but getting rid of the empty tags. Follow these steps to configure the Mediator: Drag & drop a new Mediator component into the composite. Uncheck the creation of the SOAP bindings and use the Interface Definition from WSDL template and select the existing GetFile.wsdl Double click in the mediator to edit it. Add a static routing rule to the GetFileByID operation, of type Service and select References/UCM_GetFile/GetFileByID target service: Create the request and reply XSLT mappers: Make sure you map only the dID element in the request: And do an Auto-mapper for the whole response: Finally, we can now add and configure the Service activity in the BPM process. Drag & drop it to the embedded subprocess and select the NormalizedGetFile service and getFileByID operation: Map both the input: ...and the output: Once this embedded subprocess ends, we will have all attachments (name + payload) in the attachmentsUCM variable, which is the main goal of this sample. But in order to test everything runs fine, we finish the sample writing each attachment to a file. To that end we include a final embedded subprocess to concurrently iterate through each attachmentsUCM/attachment[] element: On each iteration we will use a Service activity that invokes a File Adapter write service. In here we have two important parameters to set. First, the payload itself. The file adapter awaits binary data in base64 format (string). We have to map it using XPath (Simple mapping doesn't recognize a String as a base64-binary valid target): Second, we must set the target filename using the Service Properties dialog box: Again, note how we're making use of the loopCounter index variable to get the right element within the embedded subprocess iteration. Final blog entry about attachments will handle how to inject documents to Human Tasks from the BPM process and how to share attachments between different User Tasks. Will come soon. Again, once I finish will all posts on this matter, I will upload the whole sample project to java.net.
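
As a footnote to the mapping step described earlier: the two prefix-stripping XPath calls appear only as images in the original post, so the following is just an illustrative sketch of the idea. The element names and namespace prefixes below are made up and will not match the real attachmentCollection schema or task payload; the point is simply that substring-after() drops everything up to and including the 'ecm://' literal, leaving the numeric dID that GetFileByID expects.

<!-- Sketch only: for every UCM attachment, copy the name as-is and keep just
     the numeric dID from the "ecm://<id>" value held in the attachment URI. -->
<xsl:for-each select="/task:task/task:attachment">
  <ns0:attachment>
    <ns0:name>
      <xsl:value-of select="task:name"/>
    </ns0:name>
    <ns0:URI>
      <xsl:value-of select="substring-after(task:URI, 'ecm://')"/>
    </ns0:URI>
    <!-- empty payload tag, filled in later from the GetFileByID response -->
    <ns0:payload/>
  </ns0:attachment>
</xsl:for-each>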


  • MINSCN and Cache Fusion Read Consistent

    - by Liu Maclean
    ????? ???Ask Maclean Home ???  RAC ? Past Image PI????, ?????????,???11.2.0.3 2 Node RAC??????????: SQL> select * from v$version; BANNER -------------------------------------------------------------------------------- Oracle Database 11g Enterprise Edition Release 11.2.0.3.0 - 64bit Production PL/SQL Release 11.2.0.3.0 - Production CORE 11.2.0.3.0 Production TNS for Linux: Version 11.2.0.3.0 - Production NLSRTL Version 11.2.0.3.0 - Production SQL> select * from global_name; GLOBAL_NAME -------------------------------------------------------------------------------- www.oracledatabase12g.com SQL> drop table test purge; Table dropped. SQL> alter system flush buffer_cache; System altered. SQL> create table test(id number); insert into test values(1); insert into test values(2); commit; /* ???? rowid??TEST????2????????? */ select dbms_rowid.rowid_block_number(rowid),dbms_rowid.rowid_relative_fno(rowid) from test; DBMS_ROWID.ROWID_BLOCK_NUMBER(ROWID) DBMS_ROWID.ROWID_RELATIVE_FNO(ROWID) ------------------------------------ ------------------------------------                                89233                                    1                                89233                                    1 SQL> alter system flush buffer_cache; System altered. Instance 1 Session A ??UPDATE??: SQL> update test set id=id+1 where id=1; 1 row updated. Instance 1 Session B ??x$BH buffer header?? ?? ??Buffer??? SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;      STATE CR_SCN_BAS ---------- ----------          1          0          3    1227595 X$BH ??? STATE????Buffer???, ???????: STATE NUMBER KCBBHFREE 0 buffer free KCBBHEXLCUR 1 buffer current (and if DFS locked X) KCBBHSHRCUR 2 buffer current (and if DFS locked S) KCBBHCR 3 buffer consistant read KCBBHREADING 4 Being read KCBBHMRECOVERY 5 media recovery (current & special) KCBBHIRECOVERY 6 Instance recovery (somewhat special) ????????????? : state =1 Xcurrent ? state=2 Scurrent ? state=3 CR ??? Instance 2  ?? ????????????? ,???? gc current block 2 way  ??Current Block ??? Instance 2, ?? Instance 1 ??”Current Block” Convert ? Past Image: Instance 2 Session C SQL> update test set id=id+1 where id=2; 1 row updated. Instance 2 Session D SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 1 0 3 1227641 3 1227638 STATE =1 ?Xcurrent block???? Instance 2 , ??? Instance 1 ??? GC??: Instance 1 Session B SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;      STATE CR_SCN_BAS ---------- ----------          3    1227641          3    1227638          8          0          3    1227595 ???????, ??????Instance 1??session A????TEST??SELECT??? ,????? 3? State=3?CR ? ??????1?: Instance 1 session A ?????UPDATE? session SQL> alter session set events '10046 trace name context forever,level 8:10708 trace name context forever,level 103: trace[rac.*] disk high'; Session altered. SQL> select * from test;         ID ----------          2          2 select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;       STATE CR_SCN_BAS ---------- ----------          3    1227716          3    1227713          8          0 ?????????v$BH ????CR????,?????SELECT??? CR????????,???????? ?????????? ??X$BH?????? , ?????CR??SCN Version???: 1227641?1227638?1227595, ?SELECT?????? 2???SCN version?CR? 1227716? 1227713 ???, Oracle????????? ?????????SELECT??????event 10708? 
rac.*???TRACE,??????TRACE??: PARSING IN CURSOR #140444679938584 len=337 dep=1 uid=0 oct=3 lid=0 tim=1335698913632292 hv=3345277572 ad='bc0e68c8' sqlid='baj7tjm3q9sn4' SELECT /* OPT_DYN_SAMP */ /*+ ALL_ROWS IGNORE_WHERE_CLAUSE NO_PARALLEL(SAMPLESUB) opt_param('parallel_execution_enabled', 'false') NO_PARALLEL_INDEX(SAMPLESUB) NO_SQL_TUNE */ NVL(SUM(C1),0), NVL(SUM(C2),0) FROM (SELECT /*+ NO_PARALLEL("TEST") FULL("TEST") NO_PARALLEL_INDEX("TEST") */ 1 AS C1, 1 AS C2 FROM "SYS"."TEST" "TEST") SAMPLESUB END OF STMT PARSE #140444679938584:c=1000,e=27630,p=0,cr=0,cu=0,mis=1,r=0,dep=1,og=1,plh=1950795681,tim=1335698913632252 EXEC #140444679938584:c=0,e=44,p=0,cr=0,cu=0,mis=0,r=0,dep=1,og=1,plh=1950795681,tim=1335698913632390 *** 2012-04-29 07:28:33.632 kclscrs: req=0 block=1/89233 *** 2012-04-29 07:28:33.632 kclscrs: bid=1:3:1:0:7:80:1:0:4:0:0:0:1:2:4:1:26:0:0:0:70:1a:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:3f:0:1c:86:2d:4:0:0:0:0:a2:3c:7c:b:70:1a:0:0:0:0:1:0:7a:f8:76:1d:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:63:e5:0:0:0:0:0:0:10:0:0:0 2012-04-29 07:28:33.632578 : kjbcrc[0x15c91.1 76896.0][9] 2012-04-29 07:28:33.632616 : GSIPC:GMBQ: buff 0xba1e8f90, queue 0xbb79f278, pool 0x60013fa0, freeq 1, nxt 0xbb79f278, prv 0xbb79f278 2012-04-29 07:28:33.632634 : kjbmscrc(0x15c91.1)seq 0x2 reqid=0x1c(shadow 0xb4bb4458,reqid x1c)mas@2(infosz 200)(direct 1) 2012-04-29 07:28:33.632654 : kjbsentscn[0x0.12bbc1][to 2] 2012-04-29 07:28:33.632669 : GSIPC:SENDM: send msg 0xba1e9000 dest x20001 seq 24026 type 32 tkts xff0000 mlen x17001a0 2012-04-29 07:28:33.633385 : GSIPC:KSXPCB: msg 0xba1e9000 status 30, type 32, dest 2, rcvr 1 *** 2012-04-29 07:28:33.633 kclwcrs: wait=0 tm=689 *** 2012-04-29 07:28:33.633 kclwcrs: got 1 blocks from ksxprcv WAIT #140444679938584: nam='gc cr block 2-way' ela= 689 p1=1 p2=89233 p3=1 obj#=76896 tim=1335698913633418 2012-04-29 07:28:33.633490 : kjbcrcomplete[0x15c91.1 76896.0][0] 2012-04-29 07:28:33.633510 : kjbrcvdscn[0x0.12bbc1][from 2][idx 2012-04-29 07:28:33.633527 : kjbrcvdscn[no bscn <= rscn 0x0.12bbc1][from 2] *** 2012-04-29 07:28:33.633 kclwcrs: req=0 typ=cr(2) wtyp=2hop tm=689 ??TRACE???? ?????????TEST??????, ???????Dynamic Sampling?????,???????TEST?? CR???,???????’gc cr block 2-way’ ??: 2012-04-29 07:28:33.632654 : kjbsentscn[0x0.12bbc1][to 2] 12bbc1= 1227713  ???X$BH????CR???,kjbsentscn[0x0.12bbc1][to 2] ????? ? Instance 2 ???SCN=12bbc1=1227713   DBA=0x15c91.1 76896.0 ?  CR Request(obj#=76896) ??kjbrcvdscn????? [no bscn <= rscn 0x0.12bbc1][from 2] ,??? ??receive? 
SCN Version =12bbc1 ???Best Version CR Server Arch ??????????????????SELECT??: PARSING IN CURSOR #140444682869592 len=18 dep=0 uid=0 oct=3 lid=0 tim=1335698913635874 hv=1689401402 ad='b1a188f0' sqlid='c99yw1xkb4f1u' select * from test END OF STMT PARSE #140444682869592:c=4999,e=34017,p=0,cr=7,cu=0,mis=1,r=0,dep=0,og=1,plh=1357081020,tim=1335698913635870 EXEC #140444682869592:c=0,e=23,p=0,cr=0,cu=0,mis=0,r=0,dep=0,og=1,plh=1357081020,tim=1335698913635939 WAIT #140444682869592: nam='SQL*Net message to client' ela= 7 driver id=1650815232 #bytes=1 p3=0 obj#=0 tim=1335698913636071 *** 2012-04-29 07:28:33.636 kclscrs: req=0 block=1/89233 *** 2012-04-29 07:28:33.636 kclscrs: bid=1:3:1:0:7:83:1:0:4:0:0:0:1:2:4:1:26:0:0:0:70:1a:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:0:4:3:2:1:2:0:2:0:1c:86:2d:4:0:0:0:0:a2:3c:7c:b:70:1a:0:0:0:0:1:0:7d:f8:76:1d:1:2:dc:5:a9:fe:17:75:0:0:0:0:0:0:0:0:0:0:0:0:63:e5:0:0:0:0:0:0:10:0:0:0 2012-04-29 07:28:33.636209 : kjbcrc[0x15c91.1 76896.0][9] 2012-04-29 07:28:33.636228 : GSIPC:GMBQ: buff 0xba0e5d50, queue 0xbb79f278, pool 0x60013fa0, freeq 1, nxt 0xbb79f278, prv 0xbb79f278 2012-04-29 07:28:33.636244 : kjbmscrc(0x15c91.1)seq 0x3 reqid=0x1d(shadow 0xb4bb4458,reqid x1d)mas@2(infosz 200)(direct 1) 2012-04-29 07:28:33.636252 : kjbsentscn[0x0.12bbc4][to 2] 2012-04-29 07:28:33.636358 : GSIPC:SENDM: send msg 0xba0e5dc0 dest x20001 seq 24029 type 32 tkts xff0000 mlen x17001a0 2012-04-29 07:28:33.636861 : GSIPC:KSXPCB: msg 0xba0e5dc0 status 30, type 32, dest 2, rcvr 1 *** 2012-04-29 07:28:33.637 kclwcrs: wait=0 tm=865 *** 2012-04-29 07:28:33.637 kclwcrs: got 1 blocks from ksxprcv WAIT #140444682869592: nam='gc cr block 2-way' ela= 865 p1=1 p2=89233 p3=1 obj#=76896 tim=1335698913637294 2012-04-29 07:28:33.637356 : kjbcrcomplete[0x15c91.1 76896.0][0] 2012-04-29 07:28:33.637374 : kjbrcvdscn[0x0.12bbc4][from 2][idx 2012-04-29 07:28:33.637389 : kjbrcvdscn[no bscn <= rscn 0x0.12bbc4][from 2] *** 2012-04-29 07:28:33.637 kclwcrs: req=0 typ=cr(2) wtyp=2hop tm=865 ???, “SELECT * FROM TEST”??????’gc cr block 2-way’??:2012-04-29 07:28:33.637374 : kjbrcvdscn[0x0.12bbc4][from 2][idx 2012-04-29 07:28:33.637389 : kjbrcvdscn[no bscn ??Foreground Process? Remote LMS??got?? SCN=1227716 Version?CR, ??? ?????X$BH ?????scn??? ??????????Instance 1????2?SCN???CR?, ???????????Instance 1 Buffer Cache?? ??SCN Version ???CR ??????? ?????????: SQL> alter system set "_enable_minscn_cr"=false scope=spfile; System altered. SQL> alter system set "_db_block_max_cr_dba"=20 scope=spfile; System altered. SQL> startup force; ORA-32004: obsolete or deprecated parameter(s) specified for RDBMS instance ORACLE instance started. Total System Global Area 1570009088 bytes Fixed Size 2228704 bytes Variable Size 989859360 bytes Database Buffers 570425344 bytes Redo Buffers 7495680 bytes Database mounted. Database opened. ???? “_enable_minscn_cr”=false ? “_db_block_max_cr_dba”=20 ???RAC????, ??????: ?Instance 2 Session C ?update??????? ?????Instance 1 ????? ,????Instance 1?Request CR SQL> update test set id=id+1 where id=2; -- Instance 2 1 row updated. SQL> select * From test; -- Instance 1 ID ---------- 1 2 ??? Instance 1? X$BH?? select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0;  STATE CR_SCN_BAS ---------- ---------- 3 1273080 3 1273071 3 1273041 3 1273039 8 0 SQL> update test set id=id+1 where id=3; 1 row updated. 
SQL> select * From test; ID ---------- 1 2 SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1273091 3 1273080 3 1273071 3 1273041 3 1273039 8 0 ................... SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1273793 3 1273782 3 1273780 3 1273769 3 1273734 3 1273715 3 1273691 3 1273679 3 1273670 3 1273643 3 1273635 3 1273623 3 1273106 3 1273091 3 1273080 3 1273071 3 1273041 3 1273039 3 1273033 19 rows selected. SQL> select state,cr_scn_bas from x$bh where file#=1 and dbablk=89233 and state!=0; STATE CR_SCN_BAS ---------- ---------- 3 1274993 ????? ???? “_enable_minscn_cr”(enable/disable minscn optimization for CR)=false ? “_db_block_max_cr_dba”=20 (Maximum Allowed Number of CR buffers per dba) 2? ??? ????? Instance 1 ??????????? ?? 19????CR?? “_enable_minscn_cr”?11g??????????,???Oracle????CR????SCN,?Foreground Process Receive????????????(SCN??)?SCN Version CR Block??????CBC?? SCN??????CR? , ?????????Buffer Cache??????? ????SCN Version?CR????,????? ?????????? ?????Snap_Scn ?? SCN?? ?????????Current SCN, ??????CR??????????????????????, ????Buffer Cache? ?????????? CR?????????, ?????? “_db_block_max_cr_dba” ???????, ???????????20 ,??????Buffer Cache?????19????CR?; ???”_db_block_max_cr_dba” ???????6 , ?????Buffer cache????????CR ???????6?? ??”_enable_minscn_cr” ??CR???MINSCN ??????, ?????????CR???????, ????? Foreground Process??????CR Request , ?? Holder Instance LMS ?build?? BEST CR ??, ?????????
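
A side note for anyone reproducing the X$BH checks above: the state codes and the CR snapshot SCN can be decoded in a single query. This is only a sketch against 11.2, run as SYS, and it assumes the CR_SCN_WRP wrap column alongside CR_SCN_BAS in that version's X$BH layout; the state_name decode follows the KCBBH* table listed earlier in the post.

-- List every clone of block 89233 (file 1) in the buffer cache,
-- translating the state code and assembling the full CR SCN (wrap + base).
SELECT state,
       DECODE(state, 0, 'free', 1, 'xcur', 2, 'scur', 3, 'cr',
                     4, 'read', 5, 'mrec', 6, 'irec', TO_CHAR(state)) AS state_name,
       cr_scn_bas,
       cr_scn_wrp,
       cr_scn_wrp * POWER(2, 32) + cr_scn_bas AS cr_scn
FROM   x$bh
WHERE  file#  = 1
AND    dbablk = 89233
AND    state != 0
ORDER  BY cr_scn;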


  • CodePlex Daily Summary for Wednesday, March 16, 2011

    CodePlex Daily Summary for Wednesday, March 16, 2011Popular ReleasesuComponents: uComponents v2.1 RTM: What's new in v2.1? 5 new DataTypes JSON Datasource DropDown Multiple Textstring Similarity Text Image XPath DropDownList Please note that the release of DataType Grid has been postponed until v2.2. 3 new XSLT extensions Media Nodes Search Multi-node tree picker updates XPath start node selectors From Global or Relative start node selectors Max & Min node selection limits Bug fixes If you find a bug or have an issue, please raise a ticket here on CodePlex for us and we'l...Facebook C# SDK: 5.0.6 (BETA): This is seventh BETA release of the version 5 branch of the Facebook C# SDK. Remember this is a BETA build. Some things may change or not work exactly as planned. We are absolutely looking for feedback on this release to help us improve the final 5.X.X release. New in this release: Version 5.0.6 is almost completely backward compatible with 4.2.1 and 5.0.3 (BETA) Bug fixes and helpers to simplify many common scenarios For more information about this release see the following blog posts: F...SQLCE Code Generator: Build 1.0.3: New beta of the SQLCE Code Generator. New features: - Generates an IDataRepository interface that contains the generated repository interfaces that represents each table - Visual Studio 2010 Custom Tool Support Custom Tool: The custom tool is called SQLCECodeGenerator. Write this in the Custom Tool field in the Properties Window of an SDF file included in your project, this should create a code-behind file for the generated data access codeKooboo CMS: Kooboo 3.0 RC: Bug fixes Inline editing toolbar positioning for websites with complicate CSS. Inline editing is turned on by default now for the samplesite template. MongoDB version content query for multiple filters. . Add a new 404 page to guide users to login and create first website. Naming validation for page name and datarule name. Files in this download kooboo_CMS.zip: The Kooboo application files Content_DBProvider.zip: Additional content database implementation of MSSQL,SQLCE, RavenDB ...SQL Monitor - tracking sql server activities: SQL Monitor 3.2: 1. introduce sql color syntax highlighting with http://www.codeproject.com/KB/edit/FastColoredTextBox_.aspxUmbraco CMS: Umbraco 4.7.0: Service release fixing 50+ issues! Getting Started A great place to start is with our Getting Started Guide: Getting Started Guide: http://umbraco.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=197051 Make sure to check the free foundation videos on how to get started building Umbraco sites. 
They're available from: Introduction for webmasters: http://umbraco.tv/help-and-support/video-tutorials/getting-started Understand the Umbraco concepts: http://umbraco.tv/help-and-support...ProDinner - ASP.NET MVC EF4 Code First DDD jQuery Sample App: first release: ProDinner is an ASP.NET MVC sample application, it uses DDD, EF4 Code First for Data Access, jQuery and MvcProjectAwesome for Web UI, it has Multi-language User Interface Features: CRUD and search operations for entities Multi-Language User Interface upload and crop Images (make thumbnail) for meals pagination using "more results" button very rich and responsive UI (using Mvc Project Awesome) Multiple UI themes (using jQuery UI themes)BEPUphysics: BEPUphysics v0.15.0: BEPUphysics v0.15.0!LiveChat Starter Kit: LCSK v1.1: This release contains couple of new features and bug fixes including: Features: Send chat transcript via email Operator can now invite visitor to chat (pro-active chat request) Bug Fixes: Operator management (Save and Delete) bug fixes Operator Console chat small fixesIronRuby: 1.1.3: IronRuby 1.1.3 is a servicing release that keeps on improving compatibility with Ruby 1.9.2 and includes IronRuby integration to Visual Studio 2010. We decided to drop 1.8.6 compatibility mode in all post-1.0 releases. We recommend using IronRuby 1.0 if you need 1.8.6 compatibility. The main purpose of this release is to sync with IronPython 2.7 release, i.e. to keep the Dynamic Language Runtime that both these languages build on top shareable. This release also fixes a few bugs: 5763 Use...SQL Server PowerShell Extensions: 2.3.2.1 Production: Release 2.3.2.1 implements SQLPSX as PowersShell version 2.0 modules. SQLPSX consists of 13 modules with 163 advanced functions, 2 cmdlets and 7 scripts for working with ADO.NET, SMO, Agent, RMO, SSIS, SQL script files, PBM, Performance Counters, SQLProfiler, Oracle and MySQL and using Powershell ISE as a SQL and Oracle query tool. In addition optional backend databases and SQL Server Reporting Services 2008 reports are provided with SQLServer and PBM modules. See readme file for details.IronPython: 2.7: On behalf of the IronPython team, I'm very pleased to announce the release of IronPython 2.7. This release contains all of the language features of Python 2.7, as well as several previously missing modules and numerous bug fixes. IronPython 2.7 also includes built-in Visual Studio support through IronPython Tools for Visual Studio. IronPython 2.7 requires .NET 4.0 or Silverlight 4. To download IronPython 2.7, visit http://ironpython.codeplex.com/releases/view/54498. Any bugs should be report...XML Explorer: XML Explorer 4.0.2: Changes in 4.0: This release is built on the Microsoft .NET Framework 4 Client Profile. Changed XSD validation to use the schema specified by the XML documents. Added a VS style Error List, double-clicking an error takes you to the offending node. XPathNavigator schema validation finally gives SourceObject (was fixed in .NET 4). Added Namespaces window and better support for XPath expressions in documents with a default namespace. Added ExpandAll and CollapseAll toolbar buttons (in a...Mobile Device Detection and Redirection: 1.0.0.0: Stable Release 51 Degrees.mobi Foundation has been in beta for some time now and has been used on thousands of websites worldwide. We’re now highly confident in the product and have designated this release as stable. We recommend all users update to this version. 
New Capabilities MappingsTo improve compatibility with other libraries some new .NET capabilities are now populated with wurfl data: “maximumRenderedPageSize” populated with “max_deck_size” “rendersBreaksAfterWmlAnchor” populated ...ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.7.3: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager added interactive search for the lookupWPF Inspector: WPF Inspector 0.9.7: New Features in Version 0.9.7 - Support for .NET 3.5 and 4.0 - Multi-inspection of the same process - Property-Filtering for multiple keywords e.g. "Height Width" - Smart Element Selection - Select Controls by clicking CTRL, - Select Template-Parts by clicking CTRL+SHIFT - Possibility to hide the element adorner (over the context menu on the visual tree) - Many bugfixes??????????: All-In-One Code Framework ??? 2011-03-10: http://download.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1codechs&DownloadId=216140 ??,????。??????????All-In-One Code Framework ???,??20?Sample!!????,?????。http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 ASP.NET ??: CSASPNETBingMaps VBASPNETRemoteUploadAndDownload CS/VBASPNETSerializeJsonString CSASPNETIPtoLocation CSASPNETExcelLikeGridView ....... Winform??: FTPDownload FTPUpload MultiThreadedWebDownloader...Rawr: Rawr 4.1.0: Note: This release may say 4.0.21 in the version bar. This is a typo and the version is actually 4.1.0, not to be confused with 4.0.10 which was released a while back. Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of the 4.0.16 release, you can now also begin using the new Downloadable WPF version of Rawr!This is a Relea...PHP Manager for IIS: PHP Manager 1.1.2 for IIS 7: This is a localization release of PHP Manager for IIS 7. It contains all the functionality available in 56962 plus a few bug fixes (see change list for more details). Most importantly this release is translated into five languages: German - the translation is provided by Christian Graefe Dutch - the translation is provided by Harrie Verveer Turkish - the translation is provided by Yusuf Oztürk Japanese - the translation is provided by Kenichi Wakasa Russian - the translation is provid...TweetSharp: TweetSharp v2.0.0: Documentation for this release may be found at http://tweetsharp.codeplex.com/wikipage?title=UserGuide&referringTitle=Documentation. Beta ChangesAdded user streams support Serialization is not attempted for Twitter 5xx errors Fixes based on feedback Third Party Library VersionsHammock v1.2.0: http://hammock.codeplex.com Json.NET 4.0 Release 1: http://json.codeplex.comNew ProjectsABSC - Automatic Battle System Configurator: ABSC - Automatic Battle System Configurator Este é um aplicativo que auxilia os usuários na configuração de diversos script's de batalha disponíveis atualmente para o Rpg Maker. 
O aplicativo irá gerar um script com toda a configuração especificada pelo usuário.Active Directory Monkey: Designed for IS support teams to easily reset active directoy passwords.Ag-Light: Artesis projectAnito.ORM: Anito ORMblogengine customizations: Customizations for dotnetblogengineCool Tool: Cool Tool is a Visual studio add-in for generating business entities. Generator provides functionality to be able easily and comfortable generate class elements and implement chosen interfaces. Project is developed in VS 2010 (C# 4.0). Enjoy!csmpfit - A Least Squares Optimization Library in C# (C Sharp): A C# port of the C-based mpfit Levenberg Marquardt solver at Argonne National Labs (http://cow.physics.wisc.edu/~craigm/idl/cmpfit.html), including both desktop .NET and Silverlight project libraries.DirectX 11 Framework for Experimentation: Basic Framework for DirectX 11 (without DXUT) containing basic stuffs like Text Rendering, Quad Render, Model Loading, basic Skinning animation, Shader framework (substitute for the effect API) and lots of random stuffs !!!Doanvien code project: DoanVien code projectdtweet - a dashing way of tweeting: dtweet is developed in ASP.NET MVC 3 RTM (Razor) C# JQuery 1.5.1FormMail.NET: This is a project to support emailing form data from a .NET page, and storing that form data in an XML file.LRU Cache: This project implements the LRU Cache using C#. It uses a Dictionary and a LinkedList. Dictionary ensures fast access to the data, and the linkedlist controls which objects are to be removed first.Magelia WebStore Open-source e-commerce software: Magelia WebStore is a customizable, multilingual and multi-currency open-source e-commerce software for the .net environment. WebStore was developped C# and Aspx and only requires an SQL Server Express. MDA.Net: MDA.Net is the .Net/Silverlight port of mda-vst instruments and effects. Currently it includes just the MDA piano and an overdrive with basic interfaces to build on. Just PM me if you have some time to help - we are in no rush, aim to migrate everything one-by-one. MVC NGShop: NGShop ????????????,????Asp.net MVC 2.0 + Jquery + SQL Server 2008, ???Castle Windsor IOC、entity framework,??????visual studio 2010??,??????vs2010,?????js???。Orchard - Photo Albums module: This module allows you to create photo albums with various effects: lightbox, slideshow, etc. PowerShell Workflow: PowerShell Workflow helps organizations to define their operational processes through the power of Workflow and Powershell. Project Dark: Early version of our project. Made in Torque 3dReFSharp: ReFSharp is pretty printer, analyzing tool and translator source code texts between F#, C# and Java. Current version has full functional pretty printer for F# and pre-alpha translator of C# to F#.Relate Intranet Templates: <project name> is an open source package for EPiServer Relate which creates a foundation for an intranet.Service monitor: Service monitor is a simple utility that lets you monitor and manage the states of services of multiple machines. It allows starting/stopping and restarting. It is Windows 7 UAC aware.Sharp Console: Sharp Console is a Windows Command Line (WCL)alternative written in C#. It targets those who lack access to the WCL or simply wish to use the NET framework instead. It aims to provide the same (and more!) functionality as the WCL. 
Contribute anything you feel will make it better!Stone2: New version of Stone, but using TFS for versioningStudent Database Management System: Student Database Management System is a sample Online Web based and also desktop application which helps school to maintain their records free online. this project is open source project and Free available for all schools to check and send their response..

    Read the article

  • Five Ways Enterprise 2.0 Can Transform Your Business - Q&A from the Webcast

    - by [email protected]
    A few weeks ago, Vince Casarez and I presented with KMWorld on the Five Ways Enterprise 2.0 Can Transform Your Business. It was an enjoyable, interactive webcast in which Vince and I discussed the ways Enterprise 2.0 can transform your business and more importantly, highlighted key customer examples of how to do so. If you missed the webcast, you can catch a replay here. We had a lot of audience participation in some of the polls we conducted and in the Q&A session. We weren't able to address all of the questions during the broadcast, so we attempted to answer them here: Q: Which area within your firm focuses on Web 2.0? Meaning, do you find new departments developing just to manage the web 2.0 (Twitter, Facebook, etc.) user experience or are you structuring current departments? A: There are three distinct efforts within Oracle. The first is around delivery of these Web 2.0 services for enterprise deployments. This is the focus of the WebCenter team. The second effort is injecting these Web 2.0 services into use cases that drive the different enterprise applications. This effort is focused on how to manage these external services and bring them into a cohesive flow for marketing programs, customer care, and purchasing. The third effort is how we consume these services internally to enhance Oracle's business delivery. It leverages the technologies and use cases of the first two but also pushes the envelope with regards to future directions of these other two areas. Q: In a business, Web 2.0 is mostly like action logs. How can we leverage the official process practice versus the logs of a recent action? Example: a system configuration modified last night on a call out versus the official practice that everybody would use in the morning.A: The key thing to remember is that most Web 2.0 actions / activity streams today are based on collaboration and communication type actions. At least with public social sites like Facebook and Twitter. What we're delivering as part of the WebCenter Suite are not just these types of activities but also enterprise application activities. These enterprise application activities come from different application modules: purchasing, HR, order entry, sales opportunity, etc. The actions within these systems are normally tied to a business object or process: purchase order/customer, employee or department, customer and supplier, customer and product, respectively. Therefore, the activities or "logs" as you name them are able to be "typed" so that as a viewer, you can filter or decide to see only certain types of information. In your example, you could have a view that only showed you recent "configuration" changes and this could be right next to a view that showed off the items to be watched every morning. Q: It's great to hear about customers using the software but is there any plan for future webinars to show what the products/installs look like? That would be very helpful.A: We don't have a webinar planned to show off the install process. However, we have a viewlet that's posted on Oracle Technology Network. 
You can see it here:http://www.oracle.com/technetwork/testcontent/wcs-install-098014.htmlAnd we've got excellent documentation that walks you through the steps here:http://download.oracle.com/docs/cd/E14571_01/install.1111/e12001/install.htmAnd there's a whole set of demos and examples of what WebCenter can do at this URL:http://www.oracle.com/technetwork/middleware/webcenter/release11-demos-097468.html Q: How do you anticipate managing metadata across the enterprise to make content findable?A: We need to first make sure we are all talking about the same thing when we use a word like "metadata". Here's why...  For a developer, metadata means information that describes key elements of the portal or application and what the portal or application can do. For content systems, metadata means key terms that provide a taxonomy or folksonomy about the information that is being indexed, ordered, and managed. For business intelligence systems, metadata means key terms that provide labels to groups of data that most non-mathematicians need to understand. And for SOA, metadata means labels for parts of the processes that business owners should understand that connect development terminology. There are also additional requirements for metadata to be available to the team building these new solutions as well as requirements to make this metadata available to the running system. These requirements are often separated by "design time" and "run time" respectively. So clearly, a general goal of managing metadata across the enterprise is very challenging. We've invested a huge amount of resources around Oracle Metadata Services (MDS) to be able to provide a more generic system for all of these elements. No other vendor has anything like this technology foundation in their products. This provides a huge benefit to our customers as they will now be able to find content, processes, people, and information from a common set of search interfaces with consistent enterprise wide results. Q: Can you give your definition of terms as to document and content, please?A: Content applies to a broad category of information from Word documents, presentations and reports through attachments to invoices and/or purchase orders. Content is essentially any type of digital asset including images, video, and voice. A document is just one type of content. Q: Do you have special integration tools to realize an interaction between UCM and WebCenter Spaces/Services?A: Yes, we've dedicated a whole team of engineers to exploit the key features of Oracle UCM within WebCenter.  While ensuring that WebCenter can connect to other non-Oracle systems, we've made sure that with the combined set of Oracle technology, no other solution can match the combined power and integration.  This is part of the Oracle Fusion Middleware strategy which is to provide best in class capabilities for Content and Portals.  When combined together, the synergy between the two products enables users to quickly add capabilities when they are needed.  For example, simple document sharing is part of the combined product offering, but if legal discovery or archiving is required, Oracle UCM product includes these capabilities that can be quickly added.  There's no need to move content around or add another system to support this, it's just a feature that gets turned on within Oracle UCM. 
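Returning to the earlier answer about "typed" activity streams: the claim is that enterprise activities carry a type tied to a business object, so a viewer can filter down to, say, last night's configuration changes. Here is a minimal Java sketch of that idea, assuming a hypothetical ActivityEntry model; it is illustrative only and is not the WebCenter activity-stream API.
    import java.time.Instant;
    import java.util.List;
    import java.util.stream.Collectors;

    // Hypothetical activity-stream model. The real services expose far richer
    // metadata, but the idea of filtering by a business "type" is the same.
    public class ActivityStreamDemo {

        enum ActivityType { CONFIGURATION_CHANGE, PURCHASE_ORDER, DOCUMENT_UPLOAD, STATUS_UPDATE }

        static class ActivityEntry {
            final ActivityType type;
            final String actor;
            final String summary;
            final Instant when;

            ActivityEntry(ActivityType type, String actor, String summary, Instant when) {
                this.type = type;
                this.actor = actor;
                this.summary = summary;
                this.when = when;
            }
        }

        // Viewer-side filter: show only the entry types the user asked for,
        // e.g. recent configuration changes next to the morning watch list.
        static List<ActivityEntry> filterByType(List<ActivityEntry> stream, ActivityType wanted) {
            return stream.stream()
                         .filter(e -> e.type == wanted)
                         .collect(Collectors.toList());
        }
    }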
Q: All customers have some interaction with their applications and have many older versions, how do you see some of these new Enterprise 2.0 capabilities adding value to existing enterprise application deployments?A: Just as Service Oriented Architectures allowed for connecting the processes of different applications systems to work together, there's a need for a similar approach with regards to these enterprise 2.0 capabilities. Oracle WebCenter is built on a core architecture that allows for SOA of these Enterprise 2.0 services so that one set of scalable services can be used and integrated directly into any type of application. In this way, users can get immediate value out of the Enterprise 2.0 capabilities without having to wait for the next major release or upgrade. These centrally managed WebCenter services expose a set of standard interfaces that make it extremely easy to add them into existing applications no matter what technology the application has been implemented. Q: We've heard about Oracle Next Generation applications called "Fusion Applications", can you tell me how all this works together?A: Oracle WebCenter powers the core collaboration and social computing services found within Fusion Applications. It is the core user experience technology for how all the application screens have been implemented. And the core concept of task flows allows for all the Fusion Applications modules to be adaptable and composable by business users and IT without needing to be a professional developer. Oracle WebCenter is at the heart of the new Fusion Applications. In addition, the same patterns and technologies are now being added to the existing applications including JD Edwards, Siebel, Peoplesoft, and eBusiness Suite. The core technology enables all these customers to have a much smoother upgrade path to Fusion Applications. They get immediate benefits of injecting new user interactions into their existing applications without having to completely move to Fusion Applications. And then when the time comes, their users will already be well versed in how the new capabilities work. Q: Does any of this work with non Oracle software? Other databases? Other application servers? etc.A: We have made sure that Oracle WebCenter delivers the broadest set of development choices so that no matter what technology you developers are using, WebCenter capabilities can be quickly and easily added to the site or application. In addition, we have certified Oracle WebCenter to run against non-Oracle databases like DB2 and SQLServer. We have stated plans for certification against MySQL as well. Later in CY 2011, Oracle will provide certification on non-Oracle application servers such as WebSphere and JBoss. Q: How do we balance User and IT requirements in regards to Enterprise 2.0 technologies?A: Wrong decisions are often made because employee knowledge is not tapped efficiently and opportunities to innovate are often missed because the right people do not work together. Collaboration amongst workers in the right business context is critical for success. While standalone Enterprise 2.0 technologies can improve collaboration for collaboration's sake, using social collaboration tools in the context of business applications and processes will improve business responsiveness and lead companies to a more competitive position. As these systems become more mission critical it is essential that they maintain the highest level of performance and availability while scaling to support larger communities. 
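The answer above says the centrally managed services expose standard interfaces that existing applications can call regardless of their own technology. As a rough illustration of what "consuming such a service over a standard interface" can look like from plain Java, here is a minimal HTTP client sketch; the endpoint URL is purely hypothetical and is not an actual WebCenter REST path.
    import java.io.BufferedReader;
    import java.io.InputStreamReader;
    import java.net.HttpURLConnection;
    import java.net.URL;

    // Minimal sketch of an existing application pulling data from a centrally
    // managed collaboration service over plain HTTP. The URL is illustrative only.
    public class ServiceConsumerDemo {
        public static void main(String[] args) throws Exception {
            URL url = new URL("http://intranet.example.com/collaboration/activities?user=jdoe");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            conn.setRequestProperty("Accept", "application/json");

            // Read the response body; a real client would parse the JSON payload
            // and merge it into the host application's own UI.
            try (BufferedReader in = new BufferedReader(
                    new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
                String line;
                while ((line = in.readLine()) != null) {
                    System.out.println(line);
                }
            }
            conn.disconnect();
        }
    }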
Q: What are the ways in which Enterprise 2.0 can improve business responsiveness?A: With a wide range of Enterprise 2.0 tools in the marketplace, CIOs need to deploy solutions that will meet the requirements from users as well as address the requirements from IT. Workers want a next-generation user experience that is personalized and aggregates their daily tools and tasks, while IT needs to ensure the solution is secure, scalable, flexible, reliable and easily integrated with existing systems. An open and integrated approach to deploying portals, content management, and collaboration can enhance your business by addressing both the needs of knowledge workers for better information and the IT mandate to conserve resources by simplifying, consolidating and centralizing infrastructure and administration.  

    Read the article

  • Oracle's Global Single Schema

    - by david.butler(at)oracle.com
    Maximizing business process efficiencies in a heterogeneous environment is very difficult. The difficulty stems from the fact that the various applications across the Information Technology (IT) landscape employ different integration standards, different message passing strategies, and different workflow engines. Vendors such as Oracle and others are delivering tools to help IT organizations manage the complexities introduced by these differences. But the one remaining intractable problem impacting efficient operations is the fact that these applications have different definitions for the same business data. Business data is your business information codified for computer programs to use. A good data model will represent the way your organization does business. The computer applications your organization deploys to improve operational efficiency are built to operate on the business data organized into this schema.  If the schema does not represent how you do business, the applications on that schema cannot provide the features you need to achieve the desired efficiencies. Business processes span these applications. Data problems break these processes rendering them far less efficient than they need to be to achieve organization goals. Thus, the expected return on the investment in these applications is never realized. The success of all business processes depends on the availability of accurate master data.  Clearly, the solution to this problem is to consolidate all the master data an organization uses to run its business. Then clean it up, augment it, govern it, and connect it back to the applications that need it. Until now, this obvious solution has been difficult to achieve because no one had defined a data model sufficiently broad, deep and flexible enough to support transaction processing on all key business entities and serve as a master superset to all other operational data models deployed in heterogeneous IT environments. Today, the situation has changed. Oracle has created an operational data model (aka schema) that can support accurate and consistent master data across heterogeneous IT systems. This is foundational for providing a way to consolidate and integrate master data without having to replace investments in existing applications. This Global Single Schema (GSS) represents a revolutionary breakthrough that allows for true master data consolidation. Oracle has deep knowledge of applications dating back to the early 1990s.  It developed applications in the areas of Supply Chain Management (SCM), Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP), Customer Relationship Management (CRM), Human Capital Management (HCM), Financials and Manufacturing. In addition, Oracle applications were delivered for key industries such as Communications, Financial Services, Retail, Public Sector, High Tech Manufacturing (HTM) and more. Expertise in all these areas drove requirements for GSS. The following figure illustrates Oracle's unique position that enabled the creation of the Global Single Schema. GSS Requirements Gathering GSS defines all the key business entities and attributes including Customers, Contacts, Suppliers, Accounts, Products, Services, Materials, Employees, Installed Base, Sites, Assets, and Inventory to name just a few. In addition, Oracle delivers GSS pre-integrated with a wide variety of operational applications.  Business Process Automation EBusiness is about maximizing operational efficiency. 
At the highest level, these 'operations' span all that you do as an organization.  The following figure illustrates some of these high-level business processes. Enterprise Business Processes Supplies are procured. Assets are maintained. Materials are stored. Inventory is accumulated. Products and Services are engineered, produced and sold. Customers are serviced. And across this entire spectrum, Employees do the procuring, supporting, engineering, producing, selling and servicing. Not shown, but not to be overlooked, are the accounting and the financial processes associated with all this procuring, manufacturing, and selling activity. Supporting all these applications is the master data. When this data is fragmented and inconsistent, the business processes fail and inefficiencies multiply. But imagine having all the data under these operational business processes in one place. ·            The same accurate and timely customer data will be provided to all your operational applications from the call center to the point of sale. ·            The same accurate and timely supplier data will be provided to all your operational applications from supply chain planning to procurement. ·            The same accurate and timely product information will be available to all your operational applications from demand chain planning to marketing. You would have a single version of the truth about your assets, financial information, customers, suppliers, employees, products and services to support your business automation processes as they flow across your business applications. All company and partner personnel will access the same exact data entity across all your channels and across all your lines of business. Oracle's Global Single Schema enables this vision of a single version of the truth across the heterogeneous operational applications supporting the entire enterprise. Global Single Schema Oracle's Global Single Schema organizes hundreds of thousands of attributes into 165 major schema objects supporting over 180 business application modules. It is designed for international operations, and extensibility.  The schema is delivered with a full set of public Application Programming Interfaces (APIs) and an Integration Repository with modern Service Oriented Architecture interfaces to make data available as a services (DaaS) to business processes and enable operations in heterogeneous IT environments. ·         Key tables can be extended with unlimited numbers of additional attributes and attribute groups for maximum flexibility.  o    This enables model extensions that reflect business entities unique to your organization's operations. ·         The schema is multi-organization enabled so data manipulation can be controlled along organizational boundaries. ·         It uses variable byte Unicode to support over 31 languages. ·         The schema encodes flexible date and flexible address formats for easy localizations. No matter how complex your business is, Oracle's Global Single Schema can hold your business objects and support your global operations. Oracle's Global Single Schema identifies and defines the business objects an enterprise needs within the context of its business operations. The interrelationships between the business objects are also contained within the GSS data model. Their presence expresses fundamental business rules for the interaction between business entities. The following figure illustrates some of these connections.   
Interconnected Business Entities
Interconnected business processes require interconnected business data. No other MDM vendor has this capability. Everyone else has either a single entity they can master or separate, disconnected models for the various business entities. Higher-level integrations are made available, but that is a weak architectural alternative to data-level integration in this critically important aspect of Master Data Management.
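The extensibility point made above, that key tables can take on additional attributes and attribute groups without changing the base model, is easier to see in code. The following is a minimal Java sketch of that idea under assumed names; it is not the actual GSS API or table layout.
    import java.util.HashMap;
    import java.util.Map;

    // Hypothetical illustration of extensible attribute groups: a core entity
    // carries named groups of extra attributes so a deployment can add fields
    // without altering the base model. Not the actual GSS schema or API.
    public class ExtensibleEntityDemo {

        static class Product {
            final String productId;
            final String description;
            // group name -> (attribute name -> value)
            final Map<String, Map<String, Object>> attributeGroups = new HashMap<>();

            Product(String productId, String description) {
                this.productId = productId;
                this.description = description;
            }

            void setAttribute(String group, String name, Object value) {
                attributeGroups.computeIfAbsent(group, g -> new HashMap<>()).put(name, value);
            }

            Object getAttribute(String group, String name) {
                Map<String, Object> g = attributeGroups.get(group);
                return g == null ? null : g.get(name);
            }
        }

        public static void main(String[] args) {
            Product p = new Product("AS54888", "Standard Widget");
            // An organization-specific extension group, invisible to the base model.
            p.setAttribute("RETAIL_EXTENSIONS", "SHELF_LIFE_DAYS", 90);
            System.out.println(p.getAttribute("RETAIL_EXTENSIONS", "SHELF_LIFE_DAYS"));
        }
    }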

    Read the article

  • Fun with Declarative Components

    - by [email protected]
    Use case background
    I have been asked on a number of occasions if our selectOneChoice component could allow random text to be entered, as well as having a list of selections available. Unfortunately, the selectOneChoice component only allows entry via the dropdown selection list and doesn't allow text entry. I was thinking of possible solutions and thought that this might make a good example of using a declarative component.
    My initial idea
    My first thought was to use an af:inputText to allow the text entry, and an af:selectOneChoice with mode="compact" for the selections. To get it to lay out horizontally, we would want to use an af:panelGroupLayout with layout="horizontal". To get the label for this to line up correctly, we'll need to wrap the af:panelGroupLayout with an af:panelLabelAndMessage. This is the basic structure:
    <af:panelLabelAndMessage>
      <af:panelGroupLayout layout="horizontal">
        <af:inputText/>
        <af:selectOneChoice mode="compact"/>
      </af:panelGroupLayout>
    </af:panelLabelAndMessage>
    Make it into a declarative component
    One of the steps in making a declarative component is deciding what attributes we want to be able to specify. To keep this example simple, let's just have:
    'label' (the label of our declarative component)
    'value' (what we want to bind to the value of the input text)
    'items' (the select items in our dropdown)
    Here is the initial declarative component code (saved as the file "inputTextWithChoice.jsff"):
    <?xml version='1.0' encoding='UTF-8'?>
    <!-- Copyright (c) 2008, Oracle and/or its affiliates. All rights reserved. -->
    <jsp:root xmlns:jsp="http://java.sun.com/JSP/Page" version="2.1"
              xmlns:f="http://java.sun.com/jsf/core"
              xmlns:af="http://xmlns.oracle.com/adf/faces/rich">
      <jsp:directive.page contentType="text/html;charset=utf-8"/>
      <af:componentDef var="attrs" componentVar="comp">
        <af:xmlContent>
          <component xmlns="http://xmlns.oracle.com/adf/faces/rich/component">
            <description>Input text with choice component.</description>
            <attribute>
              <description>Label</description>
              <attribute-name>label</attribute-name>
              <attribute-class>java.lang.String</attribute-class>
            </attribute>
            <attribute>
              <description>Value</description>
              <attribute-name>value</attribute-name>
              <attribute-class>java.lang.Object</attribute-class>
            </attribute>
            <attribute>
              <description>Choice Select Items Value</description>
              <attribute-name>items</attribute-name>
              <attribute-class>[Ljavax.faces.model.SelectItem;</attribute-class>
            </attribute>
          </component>
        </af:xmlContent>
        <af:panelLabelAndMessage id="myPlm" label="#{attrs.label}" for="myIt">
          <af:panelGroupLayout id="myPgl" layout="horizontal">
            <af:inputText id="myIt" value="#{attrs.value}" partialTriggers="mySoc" label="myIt" simple="true" />
            <af:selectOneChoice id="mySoc" label="mySoc" simple="true" mode="compact" value="#{attrs.value}" autoSubmit="true">
              <f:selectItems id="mySIs" value="#{attrs.items}" />
            </af:selectOneChoice>
          </af:panelGroupLayout>
        </af:panelLabelAndMessage>
      </af:componentDef>
    </jsp:root>
    By having af:inputText and af:selectOneChoice share the same value, then (assuming that this is passed in as an EL expression) selecting something in the selectOneChoice will update the value in the af:inputText. To use this declarative component in a jspx page:
    <af:declarativeComponent id="myItwc" viewId="inputTextWithChoice.jsff" label="InputText with Choice"
                             value="#{demoInput.choiceValue}" items="#{demoInput.selectItems}" />
    Some problems arise
    At first glance, this seems to be functioning like we want it to.
However, there is a side effect to having the af:inputText and af:selectOneChoice share a value: if one changes, so does the other. The problem here is that when we update the af:inputText to something that doesn't match one of the selections in the af:selectOneChoice, the af:selectOneChoice will set itself to null (since the value doesn't match one of the selections), and the next time the page is submitted it will submit the null value and the af:inputText will be empty. Oops, we don't want that. Hmm, what to do. Okay, how about if we make sure that the current value is always available in the selection list, but let's not render it if the value is empty. We also need to add a partialTriggers attribute so that this gets updated when the af:inputText is changed. Plus, we really don't want this item to be selectable, so let's disable it.
<af:selectOneChoice id="mySoc" partialTriggers="myIt" label="mySoc" simple="true" mode="compact" value="#{attrs.value}" autoSubmit="true">
  <af:selectItem id="mySI" label="Selected:#{attrs.value}" value="#{attrs.value}" disabled="true" rendered="#{!empty attrs.value}"/>
  <af:separator id="mySp" />
  <f:selectItems id="mySIs" value="#{attrs.items}" />
</af:selectOneChoice>
That seems to be working pretty well. One minor issue that we probably can't do anything about is that when you enter something in the inputText and then click on the selectOneChoice, the popup is displayed but then goes away, because it has been replaced via PPR, as we told it to with partialTriggers="myIt". This is not that big a deal, since if you are entering something manually, you probably don't want to select something from the list right afterwards.
Making it look like a single component
Now, let's play around a bit with the contentStyle of the af:inputText and the af:selectOneChoice so that the compact icon will lay out inside the af:inputText, making it look more like an af:selectManyChoice. We need to add some padding-right to the af:inputText so there is space for the icon. These adjustments were for the Fusion FX skin.
<af:inputText id="myIt" partialTriggers="mySoc" autoSubmit="true" contentStyle="padding-right: 15px;" value="#{attrs.value}" label="myIt" simple="true" />
<af:selectOneChoice id="mySoc" partialTriggers="myIt" contentStyle="position: relative; top: -2px; left: -19px;" label="mySoc" simple="true" mode="compact" value="#{attrs.value}" autoSubmit="true">
  <af:selectItem id="mySI" label="Selected:#{attrs.value}" value="#{attrs.value}" disabled="true" rendered="#{!empty attrs.value}"/>
  <af:separator id="mySp" />
  <f:selectItems id="mySIs" value="#{attrs.items}" />
</af:selectOneChoice>
There you have it: a declarative component that allows for suggested selections, but also allows arbitrary text to be entered. This could be used for a search field, where the 'items' attribute could be populated with popular searches. Lines of Java code written: 0
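The declarative component itself needs no Java, but the jspx usage example binds to #{demoInput.choiceValue} and #{demoInput.selectItems}, so some backing bean has to supply those. The original post does not show that bean, so the following is only a minimal sketch: the bean name "demoInput", its registration, and the sample items are all assumptions.
    import javax.faces.model.SelectItem;

    // Hypothetical backing bean for the usage example above. Register it as a
    // managed bean named "demoInput" (not shown in the original post).
    public class DemoInputBean {

        private String choiceValue;

        // Suggested selections; the user can still type anything else into the
        // inputText half of the component. Values here are illustrative only.
        private final SelectItem[] selectItems = new SelectItem[] {
            new SelectItem("coffee", "coffee"),
            new SelectItem("tea", "tea"),
            new SelectItem("hot chocolate", "hot chocolate")
        };

        public String getChoiceValue() { return choiceValue; }

        public void setChoiceValue(String choiceValue) { this.choiceValue = choiceValue; }

        public SelectItem[] getSelectItems() { return selectItems; }
    }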

    Read the article

  • Acer aspire 5735z wireless not working after upgrade to 11.10

    - by Jon
    I cant get my wifi card to work at all after upgrading to 11.10 Oneiric. I'm not sure where to start to fix this. Ive tried using the additional drivers tool but this shows that no additional drivers are needed. Before my upgrade I had a drivers working for the Rt2860 chipset. Any help on this would be much appreciated.... thanks Jon jon@ubuntu:~$ ifconfig eth0 Link encap:Ethernet HWaddr 00:1d:72:ec:76:d5 inet addr:192.168.1.134 Bcast:192.168.1.255 Mask:255.255.255.0 inet6 addr: fe80::21d:72ff:feec:76d5/64 Scope:Link UP BROADCAST RUNNING MULTICAST MTU:1500 Metric:1 RX packets:7846 errors:0 dropped:0 overruns:0 frame:0 TX packets:7213 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:1000 RX bytes:8046624 (8.0 MB) TX bytes:1329442 (1.3 MB) Interrupt:16 lo Link encap:Local Loopback inet addr:127.0.0.1 Mask:255.0.0.0 inet6 addr: ::1/128 Scope:Host UP LOOPBACK RUNNING MTU:16436 Metric:1 RX packets:91 errors:0 dropped:0 overruns:0 frame:0 TX packets:91 errors:0 dropped:0 overruns:0 carrier:0 collisions:0 txqueuelen:0 RX bytes:34497 (34.4 KB) TX bytes:34497 (34.4 KB) Ive included by dmesg output below [ 0.428818] NET: Registered protocol family 2 [ 0.429003] IP route cache hash table entries: 131072 (order: 8, 1048576 bytes) [ 0.430562] TCP established hash table entries: 524288 (order: 11, 8388608 bytes) [ 0.436614] TCP bind hash table entries: 65536 (order: 8, 1048576 bytes) [ 0.437409] TCP: Hash tables configured (established 524288 bind 65536) [ 0.437412] TCP reno registered [ 0.437431] UDP hash table entries: 2048 (order: 4, 65536 bytes) [ 0.437482] UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes) [ 0.437678] NET: Registered protocol family 1 [ 0.437705] pci 0000:00:02.0: Boot video device [ 0.437892] PCI: CLS 64 bytes, default 64 [ 0.437916] Simple Boot Flag at 0x57 set to 0x1 [ 0.438294] audit: initializing netlink socket (disabled) [ 0.438309] type=2000 audit(1319243447.432:1): initialized [ 0.440763] Freeing initrd memory: 13416k freed [ 0.468362] HugeTLB registered 2 MB page size, pre-allocated 0 pages [ 0.488192] VFS: Disk quotas dquot_6.5.2 [ 0.488254] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) [ 0.488888] fuse init (API version 7.16) [ 0.488985] msgmni has been set to 5890 [ 0.489381] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 253) [ 0.489413] io scheduler noop registered [ 0.489415] io scheduler deadline registered [ 0.489460] io scheduler cfq registered (default) [ 0.489583] pcieport 0000:00:1c.0: setting latency timer to 64 [ 0.489633] pcieport 0000:00:1c.0: irq 40 for MSI/MSI-X [ 0.489699] pcieport 0000:00:1c.1: setting latency timer to 64 [ 0.489741] pcieport 0000:00:1c.1: irq 41 for MSI/MSI-X [ 0.489800] pcieport 0000:00:1c.2: setting latency timer to 64 [ 0.489841] pcieport 0000:00:1c.2: irq 42 for MSI/MSI-X [ 0.489904] pcieport 0000:00:1c.3: setting latency timer to 64 [ 0.489944] pcieport 0000:00:1c.3: irq 43 for MSI/MSI-X [ 0.490006] pcieport 0000:00:1c.4: setting latency timer to 64 [ 0.490047] pcieport 0000:00:1c.4: irq 44 for MSI/MSI-X [ 0.490126] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 [ 0.490149] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 [ 0.490196] intel_idle: MWAIT substates: 0x1110 [ 0.490198] intel_idle: does not run on family 6 model 15 [ 0.491240] ACPI: Deprecated procfs I/F for AC is loaded, please retry with CONFIG_ACPI_PROCFS_POWER cleared [ 0.493473] ACPI: AC Adapter [ADP1] (on-line) [ 0.493590] input: Lid Switch as /devices/LNXSYSTM:00/device:00/PNP0C0D:00/input/input0 
[ 0.496771] ACPI: Lid Switch [LID0] [ 0.496818] input: Sleep Button as /devices/LNXSYSTM:00/device:00/PNP0C0E:00/input/input1 [ 0.496823] ACPI: Sleep Button [SLPB] [ 0.496865] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input2 [ 0.496869] ACPI: Power Button [PWRF] [ 0.496900] ACPI: acpi_idle registered with cpuidle [ 0.498719] Monitor-Mwait will be used to enter C-1 state [ 0.498753] Monitor-Mwait will be used to enter C-2 state [ 0.498761] Marking TSC unstable due to TSC halts in idle [ 0.517627] thermal LNXTHERM:00: registered as thermal_zone0 [ 0.517630] ACPI: Thermal Zone [TZS0] (67 C) [ 0.524796] thermal LNXTHERM:01: registered as thermal_zone1 [ 0.524799] ACPI: Thermal Zone [TZS1] (67 C) [ 0.524823] ACPI: Deprecated procfs I/F for battery is loaded, please retry with CONFIG_ACPI_PROCFS_POWER cleared [ 0.524852] ERST: Table is not found! [ 0.524948] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.680991] ACPI: Battery Slot [BAT0] (battery present) [ 0.688567] Linux agpgart interface v0.103 [ 0.688672] agpgart-intel 0000:00:00.0: Intel GM45 Chipset [ 0.688865] agpgart-intel 0000:00:00.0: detected gtt size: 2097152K total, 262144K mappable [ 0.689786] agpgart-intel 0000:00:00.0: detected 65536K stolen memory [ 0.689912] agpgart-intel 0000:00:00.0: AGP aperture is 256M @ 0xd0000000 [ 0.691006] brd: module loaded [ 0.691510] loop: module loaded [ 0.691967] Fixed MDIO Bus: probed [ 0.691990] PPP generic driver version 2.4.2 [ 0.692065] tun: Universal TUN/TAP device driver, 1.6 [ 0.692067] tun: (C) 1999-2004 Max Krasnyansky <[email protected]> [ 0.692146] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.692181] ehci_hcd 0000:00:1a.7: PCI INT C -> GSI 20 (level, low) -> IRQ 20 [ 0.692206] ehci_hcd 0000:00:1a.7: setting latency timer to 64 [ 0.692210] ehci_hcd 0000:00:1a.7: EHCI Host Controller [ 0.692255] ehci_hcd 0000:00:1a.7: new USB bus registered, assigned bus number 1 [ 0.692289] ehci_hcd 0000:00:1a.7: debug port 1 [ 0.696181] ehci_hcd 0000:00:1a.7: cache line size of 64 is not supported [ 0.696202] ehci_hcd 0000:00:1a.7: irq 20, io mem 0xf8904800 [ 0.712014] ehci_hcd 0000:00:1a.7: USB 2.0 started, EHCI 1.00 [ 0.712131] hub 1-0:1.0: USB hub found [ 0.712136] hub 1-0:1.0: 6 ports detected [ 0.712230] ehci_hcd 0000:00:1d.7: PCI INT A -> GSI 23 (level, low) -> IRQ 23 [ 0.712243] ehci_hcd 0000:00:1d.7: setting latency timer to 64 [ 0.712247] ehci_hcd 0000:00:1d.7: EHCI Host Controller [ 0.712287] ehci_hcd 0000:00:1d.7: new USB bus registered, assigned bus number 2 [ 0.712315] ehci_hcd 0000:00:1d.7: debug port 1 [ 0.716201] ehci_hcd 0000:00:1d.7: cache line size of 64 is not supported [ 0.716216] ehci_hcd 0000:00:1d.7: irq 23, io mem 0xf8904c00 [ 0.732014] ehci_hcd 0000:00:1d.7: USB 2.0 started, EHCI 1.00 [ 0.732130] hub 2-0:1.0: USB hub found [ 0.732135] hub 2-0:1.0: 6 ports detected [ 0.732209] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.732223] uhci_hcd: USB Universal Host Controller Interface driver [ 0.732254] uhci_hcd 0000:00:1a.0: PCI INT A -> GSI 20 (level, low) -> IRQ 20 [ 0.732262] uhci_hcd 0000:00:1a.0: setting latency timer to 64 [ 0.732265] uhci_hcd 0000:00:1a.0: UHCI Host Controller [ 0.732298] uhci_hcd 0000:00:1a.0: new USB bus registered, assigned bus number 3 [ 0.732325] uhci_hcd 0000:00:1a.0: irq 20, io base 0x00001820 [ 0.732441] hub 3-0:1.0: USB hub found [ 0.732445] hub 3-0:1.0: 2 ports detected [ 0.732508] uhci_hcd 0000:00:1a.1: PCI INT B -> GSI 20 (level, low) -> IRQ 20 [ 0.732514] uhci_hcd 0000:00:1a.1: 
setting latency timer to 64 [ 0.732518] uhci_hcd 0000:00:1a.1: UHCI Host Controller [ 0.732553] uhci_hcd 0000:00:1a.1: new USB bus registered, assigned bus number 4 [ 0.732577] uhci_hcd 0000:00:1a.1: irq 20, io base 0x00001840 [ 0.732696] hub 4-0:1.0: USB hub found [ 0.732700] hub 4-0:1.0: 2 ports detected [ 0.732762] uhci_hcd 0000:00:1a.2: PCI INT C -> GSI 20 (level, low) -> IRQ 20 [ 0.732768] uhci_hcd 0000:00:1a.2: setting latency timer to 64 [ 0.732772] uhci_hcd 0000:00:1a.2: UHCI Host Controller [ 0.732805] uhci_hcd 0000:00:1a.2: new USB bus registered, assigned bus number 5 [ 0.732829] uhci_hcd 0000:00:1a.2: irq 20, io base 0x00001860 [ 0.732942] hub 5-0:1.0: USB hub found [ 0.732946] hub 5-0:1.0: 2 ports detected [ 0.733007] uhci_hcd 0000:00:1d.0: PCI INT A -> GSI 23 (level, low) -> IRQ 23 [ 0.733014] uhci_hcd 0000:00:1d.0: setting latency timer to 64 [ 0.733017] uhci_hcd 0000:00:1d.0: UHCI Host Controller [ 0.733057] uhci_hcd 0000:00:1d.0: new USB bus registered, assigned bus number 6 [ 0.733082] uhci_hcd 0000:00:1d.0: irq 23, io base 0x00001880 [ 0.733202] hub 6-0:1.0: USB hub found [ 0.733206] hub 6-0:1.0: 2 ports detected [ 0.733265] uhci_hcd 0000:00:1d.1: PCI INT B -> GSI 17 (level, low) -> IRQ 17 [ 0.733273] uhci_hcd 0000:00:1d.1: setting latency timer to 64 [ 0.733276] uhci_hcd 0000:00:1d.1: UHCI Host Controller [ 0.733313] uhci_hcd 0000:00:1d.1: new USB bus registered, assigned bus number 7 [ 0.733351] uhci_hcd 0000:00:1d.1: irq 17, io base 0x000018a0 [ 0.733466] hub 7-0:1.0: USB hub found [ 0.733470] hub 7-0:1.0: 2 ports detected [ 0.733532] uhci_hcd 0000:00:1d.2: PCI INT C -> GSI 18 (level, low) -> IRQ 18 [ 0.733539] uhci_hcd 0000:00:1d.2: setting latency timer to 64 [ 0.733542] uhci_hcd 0000:00:1d.2: UHCI Host Controller [ 0.733578] uhci_hcd 0000:00:1d.2: new USB bus registered, assigned bus number 8 [ 0.733610] uhci_hcd 0000:00:1d.2: irq 18, io base 0x000018c0 [ 0.733730] hub 8-0:1.0: USB hub found [ 0.733736] hub 8-0:1.0: 2 ports detected [ 0.733843] i8042: PNP: PS/2 Controller [PNP0303:KBD0,PNP0f13:PS2M] at 0x60,0x64 irq 1,12 [ 0.751594] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.751605] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.751732] mousedev: PS/2 mouse device common for all mice [ 0.752670] rtc_cmos 00:08: RTC can wake from S4 [ 0.752770] rtc_cmos 00:08: rtc core: registered rtc_cmos as rtc0 [ 0.752796] rtc0: alarms up to one month, y3k, 242 bytes nvram, hpet irqs [ 0.752907] device-mapper: uevent: version 1.0.3 [ 0.752976] device-mapper: ioctl: 4.20.0-ioctl (2011-02-02) initialised: [email protected] [ 0.753028] cpuidle: using governor ladder [ 0.753093] cpuidle: using governor menu [ 0.753096] EFI Variables Facility v0.08 2004-May-17 [ 0.753361] TCP cubic registered [ 0.753482] NET: Registered protocol family 10 [ 0.753966] NET: Registered protocol family 17 [ 0.753992] Registering the dns_resolver key type [ 0.754113] PM: Hibernation image not present or could not be loaded. [ 0.754131] registered taskstats version 1 [ 0.771553] Magic number: 15:152:507 [ 0.771667] rtc_cmos 00:08: setting system clock to 2011-10-22 00:30:48 UTC (1319243448) [ 0.772238] BIOS EDD facility v0.16 2004-Jun-25, 0 devices found [ 0.772240] EDD information not available. 
[ 0.774165] Freeing unused kernel memory: 984k freed [ 0.774504] Write protecting the kernel read-only data: 10240k [ 0.774755] Freeing unused kernel memory: 20k freed [ 0.775093] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input3 [ 0.779727] Freeing unused kernel memory: 1400k freed [ 0.801946] udevd[84]: starting version 173 [ 0.880950] sky2: driver version 1.28 [ 0.881046] sky2 0000:02:00.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 0.881096] sky2 0000:02:00.0: setting latency timer to 64 [ 0.881197] sky2 0000:02:00.0: Yukon-2 Extreme chip revision 2 [ 0.881871] sky2 0000:02:00.0: irq 45 for MSI/MSI-X [ 0.896273] sky2 0000:02:00.0: eth0: addr 00:1d:72:ec:76:d5 [ 0.910630] ahci 0000:00:1f.2: version 3.0 [ 0.910647] ahci 0000:00:1f.2: PCI INT B -> GSI 19 (level, low) -> IRQ 19 [ 0.910710] ahci 0000:00:1f.2: irq 46 for MSI/MSI-X [ 0.910775] ahci: SSS flag set, parallel bus scan disabled [ 0.910812] ahci 0000:00:1f.2: AHCI 0001.0200 32 slots 4 ports 3 Gbps 0x33 impl SATA mode [ 0.910816] ahci 0000:00:1f.2: flags: 64bit ncq sntf stag pm led clo pio slum part ccc ems sxs [ 0.910821] ahci 0000:00:1f.2: setting latency timer to 64 [ 0.941773] scsi0 : ahci [ 0.941954] scsi1 : ahci [ 0.942038] scsi2 : ahci [ 0.942118] scsi3 : ahci [ 0.942196] scsi4 : ahci [ 0.942268] scsi5 : ahci [ 0.942332] ata1: SATA max UDMA/133 abar m2048@0xf8904000 port 0xf8904100 irq 46 [ 0.942336] ata2: SATA max UDMA/133 abar m2048@0xf8904000 port 0xf8904180 irq 46 [ 0.942339] ata3: DUMMY [ 0.942340] ata4: DUMMY [ 0.942344] ata5: SATA max UDMA/133 abar m2048@0xf8904000 port 0xf8904300 irq 46 [ 0.942347] ata6: SATA max UDMA/133 abar m2048@0xf8904000 port 0xf8904380 irq 46 [ 1.028061] usb 1-5: new high speed USB device number 2 using ehci_hcd [ 1.181775] usbcore: registered new interface driver uas [ 1.260062] ata1: SATA link up 3.0 Gbps (SStatus 123 SControl 300) [ 1.261126] ata1.00: ATA-8: Hitachi HTS543225L9A300, FBEOC40C, max UDMA/133 [ 1.261129] ata1.00: 488397168 sectors, multi 16: LBA48 NCQ (depth 31/32), AA [ 1.262360] ata1.00: configured for UDMA/133 [ 1.262518] scsi 0:0:0:0: Direct-Access ATA Hitachi HTS54322 FBEO PQ: 0 ANSI: 5 [ 1.262716] sd 0:0:0:0: Attached scsi generic sg0 type 0 [ 1.262762] sd 0:0:0:0: [sda] 488397168 512-byte logical blocks: (250 GB/232 GiB) [ 1.262824] sd 0:0:0:0: [sda] Write Protect is off [ 1.262827] sd 0:0:0:0: [sda] Mode Sense: 00 3a 00 00 [ 1.262851] sd 0:0:0:0: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA [ 1.287277] sda: sda1 sda2 sda3 [ 1.287693] sd 0:0:0:0: [sda] Attached SCSI disk [ 1.580059] ata2: SATA link up 1.5 Gbps (SStatus 113 SControl 300) [ 1.581188] ata2.00: ATAPI: HL-DT-STDVDRAM GT10N, 1.00, max UDMA/100 [ 1.582663] ata2.00: configured for UDMA/100 [ 1.584162] scsi 1:0:0:0: CD-ROM HL-DT-ST DVDRAM GT10N 1.00 PQ: 0 ANSI: 5 [ 1.585821] sr0: scsi3-mmc drive: 24x/24x writer dvd-ram cd/rw xa/form2 cdda tray [ 1.585824] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 1.585953] sr 1:0:0:0: Attached scsi CD-ROM sr0 [ 1.586038] sr 1:0:0:0: Attached scsi generic sg1 type 5 [ 1.632061] usb 6-1: new low speed USB device number 2 using uhci_hcd [ 1.908056] ata5: SATA link down (SStatus 0 SControl 300) [ 2.228065] ata6: SATA link down (SStatus 0 SControl 300) [ 2.228955] Initializing USB Mass Storage driver... [ 2.229052] usbcore: registered new interface driver usb-storage [ 2.229054] USB Mass Storage support registered. 
[ 2.235827] scsi6 : usb-storage 1-5:1.0 [ 2.235987] usbcore: registered new interface driver ums-realtek [ 2.244451] input: B16_b_02 USB-PS/2 Optical Mouse as /devices/pci0000:00/0000:00:1d.0/usb6/6-1/6-1:1.0/input/input4 [ 2.244598] generic-usb 0003:046D:C025.0001: input,hidraw0: USB HID v1.10 Mouse [B16_b_02 USB-PS/2 Optical Mouse] on usb-0000:00:1d.0-1/input0 [ 2.244620] usbcore: registered new interface driver usbhid [ 2.244622] usbhid: USB HID core driver [ 3.091083] EXT4-fs (loop0): mounted filesystem with ordered data mode. Opts: (null) [ 3.238275] scsi 6:0:0:0: Direct-Access Generic- Multi-Card 1.00 PQ: 0 ANSI: 0 CCS [ 3.348261] sd 6:0:0:0: Attached scsi generic sg2 type 0 [ 3.351897] sd 6:0:0:0: [sdb] Attached SCSI removable disk [ 47.138012] udevd[334]: starting version 173 [ 47.177678] lp: driver loaded but no devices found [ 47.197084] wmi: Mapper loaded [ 47.197526] acer_wmi: Acer Laptop ACPI-WMI Extras [ 47.210227] acer_wmi: Brightness must be controlled by generic video driver [ 47.566578] Disabling lock debugging due to kernel taint [ 47.584050] ndiswrapper version 1.56 loaded (smp=yes, preempt=no) [ 47.620666] type=1400 audit(1319239895.347:2): apparmor="STATUS" operation="profile_load" name="/sbin/dhclient" pid=624 comm="apparmor_parser" [ 47.620934] type=1400 audit(1319239895.347:3): apparmor="STATUS" operation="profile_load" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=624 comm="apparmor_parser" [ 47.621108] type=1400 audit(1319239895.347:4): apparmor="STATUS" operation="profile_load" name="/usr/lib/connman/scripts/dhclient-script" pid=624 comm="apparmor_parser" [ 47.633056] [drm] Initialized drm 1.1.0 20060810 [ 47.722594] i915 0000:00:02.0: PCI INT A -> GSI 16 (level, low) -> IRQ 16 [ 47.722602] i915 0000:00:02.0: setting latency timer to 64 [ 47.807152] ndiswrapper (check_nt_hdr:141): kernel is 64-bit, but Windows driver is not 64-bit;bad magic: 010B [ 47.807159] ndiswrapper (load_sys_files:206): couldn't prepare driver 'rt2860' [ 47.807930] ndiswrapper (load_wrap_driver:108): couldn't load driver rt2860; check system log for messages from 'loadndisdriver' [ 47.856250] usbcore: registered new interface driver ndiswrapper [ 47.861772] i915 0000:00:02.0: irq 47 for MSI/MSI-X [ 47.861781] [drm] Supports vblank timestamp caching Rev 1 (10.10.2010). [ 47.861783] [drm] Driver supports precise vblank timestamp query. [ 47.861842] vgaarb: device changed decodes: PCI:0000:00:02.0,olddecodes=io+mem,decodes=io+mem:owns=io+mem [ 47.980620] fixme: max PWM is zero. [ 48.286153] fbcon: inteldrmfb (fb0) is primary device [ 48.287033] Console: switching to colour frame buffer device 170x48 [ 48.287062] fb0: inteldrmfb frame buffer device [ 48.287064] drm: registered panic notifier [ 48.333883] acpi device:02: registered as cooling_device2 [ 48.334053] input: Video Bus as /devices/LNXSYSTM:00/device:00/PNP0A08:00/LNXVIDEO:00/input/input5 [ 48.334128] ACPI: Video Device [GFX0] (multi-head: yes rom: no post: no) [ 48.334203] [drm] Initialized i915 1.6.0 20080730 for 0000:00:02.0 on minor 0 [ 48.334644] HDA Intel 0000:00:1b.0: power state changed by ACPI to D0 [ 48.334652] HDA Intel 0000:00:1b.0: power state changed by ACPI to D0 [ 48.334673] HDA Intel 0000:00:1b.0: PCI INT A -> GSI 21 (level, low) -> IRQ 21 [ 48.334737] HDA Intel 0000:00:1b.0: irq 48 for MSI/MSI-X [ 48.334772] HDA Intel 0000:00:1b.0: setting latency timer to 64 [ 48.356107] Adding 261116k swap on /host/ubuntu/disks/swap.disk. 
Priority:-1 extents:1 across:261116k [ 48.380946] hda_codec: ALC268: BIOS auto-probing. [ 48.390242] input: HDA Intel Mic as /devices/pci0000:00/0000:00:1b.0/sound/card0/input6 [ 48.390365] input: HDA Intel Headphone as /devices/pci0000:00/0000:00:1b.0/sound/card0/input7 [ 48.490870] EXT4-fs (loop0): re-mounted. Opts: errors=remount-ro,user_xattr [ 48.917990] ppdev: user-space parallel port driver [ 48.950729] type=1400 audit(1319239896.675:5): apparmor="STATUS" operation="profile_load" name="/usr/lib/cups/backend/cups-pdf" pid=941 comm="apparmor_parser" [ 48.951114] type=1400 audit(1319239896.675:6): apparmor="STATUS" operation="profile_load" name="/usr/sbin/cupsd" pid=941 comm="apparmor_parser" [ 48.977706] Synaptics Touchpad, model: 1, fw: 7.2, id: 0x1c0b1, caps: 0xd04733/0xa44000/0xa0000 [ 49.048871] input: SynPS/2 Synaptics TouchPad as /devices/platform/i8042/serio1/input/input8 [ 49.078713] sky2 0000:02:00.0: eth0: enabling interface [ 49.079462] ADDRCONF(NETDEV_UP): eth0: link is not ready [ 50.762266] sky2 0000:02:00.0: eth0: Link is up at 100 Mbps, full duplex, flow control rx [ 50.762702] ADDRCONF(NETDEV_CHANGE): eth0: link becomes ready [ 54.751478] type=1400 audit(1319239902.475:7): apparmor="STATUS" operation="profile_load" name="/usr/lib/lightdm/lightdm-guest-session-wrapper" pid=1039 comm="apparmor_parser" [ 54.755907] type=1400 audit(1319239902.479:8): apparmor="STATUS" operation="profile_replace" name="/sbin/dhclient" pid=1040 comm="apparmor_parser" [ 54.756237] type=1400 audit(1319239902.483:9): apparmor="STATUS" operation="profile_replace" name="/usr/lib/NetworkManager/nm-dhcp-client.action" pid=1040 comm="apparmor_parser" [ 54.756417] type=1400 audit(1319239902.483:10): apparmor="STATUS" operation="profile_replace" name="/usr/lib/connman/scripts/dhclient-script" pid=1040 comm="apparmor_parser" [ 54.764825] type=1400 audit(1319239902.491:11): apparmor="STATUS" operation="profile_load" name="/usr/bin/evince" pid=1041 comm="apparmor_parser" [ 54.768365] type=1400 audit(1319239902.495:12): apparmor="STATUS" operation="profile_load" name="/usr/bin/evince-previewer" pid=1041 comm="apparmor_parser" [ 54.770601] type=1400 audit(1319239902.495:13): apparmor="STATUS" operation="profile_load" name="/usr/bin/evince-thumbnailer" pid=1041 comm="apparmor_parser" [ 54.770729] type=1400 audit(1319239902.495:14): apparmor="STATUS" operation="profile_load" name="/usr/share/gdm/guest-session/Xsession" pid=1038 comm="apparmor_parser" [ 54.775181] type=1400 audit(1319239902.499:15): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/mission-control-5" pid=1043 comm="apparmor_parser" [ 54.775533] type=1400 audit(1319239902.499:16): apparmor="STATUS" operation="profile_load" name="/usr/lib/telepathy/telepathy-*" pid=1043 comm="apparmor_parser" [ 54.936691] init: failsafe main process (891) killed by TERM signal [ 54.944583] init: apport pre-start process (1096) terminated with status 1 [ 55.000373] init: apport post-stop process (1160) terminated with status 1 [ 55.005291] init: gdm main process (1159) killed by TERM signal [ 59.782579] EXT4-fs (loop0): re-mounted. 
Opts: errors=remount-ro,user_xattr,commit=0 [ 60.992021] eth0: no IPv6 routers present [ 61.936072] device eth0 entered promiscuous mode [ 62.053949] Bluetooth: Core ver 2.16 [ 62.054005] NET: Registered protocol family 31 [ 62.054007] Bluetooth: HCI device and connection manager initialized [ 62.054010] Bluetooth: HCI socket layer initialized [ 62.054012] Bluetooth: L2CAP socket layer initialized [ 62.054993] Bluetooth: SCO socket layer initialized [ 62.058750] Bluetooth: RFCOMM TTY layer initialized [ 62.058758] Bluetooth: RFCOMM socket layer initialized [ 62.058760] Bluetooth: RFCOMM ver 1.11 [ 62.059428] Bluetooth: BNEP (Ethernet Emulation) ver 1.3 [ 62.059432] Bluetooth: BNEP filters: protocol multicast [ 62.460389] init: plymouth-stop pre-start process (1662) terminated with status 1 '

    Read the article

  • CodePlex Daily Summary for Friday, January 14, 2011

    CodePlex Daily Summary for Friday, January 14, 2011Popular ReleasesBloodSim: BloodSim - 1.3.3.0: NOTE: If you have a previous version of BloodSim, you must update WControls.dll from the Required DLLs download. - Removed average stats from main window as this is no longer necessary - Added ability to name a set of runs that will show up in the title bar of the graph window - Added ability to run progressive simulation sets, increasing up to 2 chosen stats by a value each time - Added option to set the amount that Death Strike heals for on the Last 5s Damage - Added option for Blood Shiel...WCF Community Site: WCF Web APIs 11.01.14: Welcome to the third release of WCF Web APIs on codeplex New Features - WCF HTTP New HttpClient API which replaces the REST Starter kit has been introduced. In addition to supporting strongly typed messages, the API supports async through Task<T>. It also has a pluggable channel stack for plugging in handlers for the request / response from the client side. See the QueryableSample for an example of the new channels. New extension methods for serializing / deserializing to/from HttpContent. ...POP Forums for ASP.NET MVC3: POP Forums v9 - Beta 1: This is the first beta for the ASP.NET MVC 3 version of POP Forums. It is nearly feature complete, and ready for testing and feedback. For previous release notes, look here, here and here. Check out the live preview: http://preview.popforums.com/Forums Setup instructions are on the home page of this project. The new hotness in the beta, or what has been done since the last preview: All views converted to use Razor E-mail subscription/notification of new posts New post indicators/mark r...mytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.51.1 beta 2: New captcha bug fixed at install/uninstall module -> auto create/delete all files and folders for module WEB.mytrip.mvc 1.0.51.1 Web for install hosting System Requirements: NET 4.0, MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) SRC.mytrip.mvc 1.0.51.1 System Requirements: Visual Studio 2010 or Web Deweloper 2010 MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) Co...NuGet: NuGet 1.0 RTM: NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog.MVC Music Store: MVC Music Store v2.0: This is the 2.0 release of the MVC Music Store Tutorial. This tutorial is updated for ASP.NET MVC 3 and Entity Framework Code-First, and contains fixes and improvements based on feedback and common questions from previous releases. The main download, MvcMusicStore-v2.0.zip, contains everything you need to build the sample application, including A detailed tutorial document in PDF format Assets you will need to build the project, including images, a stylesheet, and a pre-populated databas...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.7 GA Released: Hi, Today we are releasing Visifire 3.6.7 GA with the following feature: * Inlines property has been implemented in Title. Also, this release contains fix for the following bugs: * In Column and Bar chart DataPoint’s label properties were not working as expected at real-time if marker enabled was set to true. 
* 3D Column and Bar chart were not rendered properly if AxisMinimum property was set in x-axis. You can download Visifire v3.6.7 here. Cheers, Team VisifireFluent Validation for .NET: 2.0: Changes since 2.0 RC Fix typo in the name of FallbackAwareResourceAccessorBuilder Fix issue #7062 - allow validator selectors to work against nullable properties with overriden names. Fix error in German localization. Better support for client-side validation messages in MVC integration. All changes since 1.3 Allow custom MVC ModelValidators to be added to the FVModelValidatorProvider Support resource provider for custom property validators through the new IResourceAccessorBuilder ...EnhSim: EnhSim 2.3.2 ALPHA: 2.3.1 ALPHAThis release supports WoW patch 4.03a at level 85 To use this release, you must have the Microsoft Visual C++ 2010 Redistributable Package installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=A7B7A05E-6DE6-4D3A-A423-37BF0912DB84 To use the GUI you must have the .NET 4.0 Framework installed. This can be downloaded from http://www.microsoft.com/downloads/en/details.aspx?FamilyID=9cfb2d51-5ff4-4491-b0e5-b386f32c0992 - Quick update to ...ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.6: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager new stuff: paging for the lookup lookup with multiselect changes: the css classes used by the framework where renamed to be more standard the lookup controller requries an item.ascx (no more ViewData["structure"]), and LookupList action renamed to Search all the...??????????: All-In-One Code Framework ??? 2011-01-12: 2011???????All-In-One Code Framework(??) 2011?1??????!!http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 ?????release?,???????ASP.NET, AJAX, WinForm, Windows Shell????13?Sample Code。???,??????????sample code。 ?????:http://blog.csdn.net/sjb5201/archive/2011/01/13/6135037.aspx ??,??????MSDN????????????。 http://social.msdn.microsoft.com/Forums/zh-CN/codezhchs/threads ?????????????????,??Email ????patterns & practices – Enterprise Library: Enterprise Library 5.0 - Extensibility Labs: This is a preview release of the Hands-on Labs to help you learn and practice different ways the Enterprise Library can be extended. Learning MapCustom exception handler (estimated time to complete: 1 hr 15 mins) Custom logging trace listener (1 hr) Custom configuration source (registry-based) (30 mins) System requirementsEnterprise Library 5.0 / Unity 2.0 installed SQL Express 2008 installed Visual Studio 2010 Pro (or better) installed AuthorsChris Tavares, Microsoft Corporation ...Orchard Project: Orchard 1.0: Orchard Release Notes Build: 1.0.20 Published: 1/12/2010 How to Install OrchardTo install the Orchard tech preview using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard.ashx Web PI will detect your hardware environment and install the application. --OR-- Alternatively, to install the release manually, download the Orchard.Web.1.0.20.zip file. The zip contents are pre-built and ready-to-run. 
Simply extract the contents of the Orchard folder from ...Umbraco CMS: Umbraco 4.6.1: The Umbraco 4.6.1 (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and contains nearly 200 bug fixes since the 4.5.2 release. Getting Started A great place to start is with our Getting Started Guide: Getting Started Guide: http://umbraco.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=197051 Make sure to check the free foundation videos on how to get started building Umbraco sites. They're ...StyleCop for ReSharper: StyleCop for ReSharper 5.1.14986.000: A considerable amount of work has gone into this release: Features: Huge focus on performance around the violation scanning subsystem: - caching added to reduce IO operations around reading and merging of settings files - caching added to reduce creation of expensive objects Users should notice condsiderable perf boost and a decrease in memory usage. Bug Fixes: - StyleCop's new ObjectBasedEnvironment object does not resolve the StyleCop installation path, thus it does not return the ...Facebook C# SDK: 4.2.1: - Authentication bug fixes - Updated Json.Net to version 4.0.0 - BREAKING CHANGE: Removed cookieSupport config setting, now automatic. This download is also availible on NuGet: Facebook FacebookWeb FacebookWebMvcPhalanger - The PHP Language Compiler for the .NET Framework: 2.0 (January 2011): Another release build for daily use; it contains many new features, enhanced compatibility with latest PHP opensource applications and several issue fixes. To improve the performance of your application using MySQL, please use Managed MySQL Extension for Phalanger. Changes made within this release include following: New features available only in Phalanger. Full support of Multi-Script-Assemblies was implemented; you can build your application into several DLLs now. Deploy them separately t...AutoLoL: AutoLoL v1.5.3: A message will be displayed when there's an update available Shows a list of recent mastery files in the Editor Tab (requested by quite a few people) Updater: Update information is now scrollable Added a buton to launch AutoLoL after updating is finished Updated the UI to match that of AutoLoL Fix: Detects and resolves 'Read Only' state on Version.xmlASP.NET: ASP.NET MVC 3 RTM: This release contains the source code for ASP.NET MVC 2 RTM as well as the ASP.NET MVC Futures project. The futures project contains features that the ASP.NET MVC team is considering for a future release of ASP.NET MVC This release contains the source code for ASP.NET MVC 3 RTM ASP.NET Web Pages with Razor Syntax ASP.NET MVC 3 Futures For more details about ASP.NET MVC 3, please visit http://asp.net/mvc/mvc3 as well as the Release NotesExtended WPF Toolkit: Extended WPF Toolkit - 1.3.0: What's in the 1.3.0 Release?BusyIndicator ButtonSpinner ChildWindow ColorPicker - Updated (Breaking Changes) DateTimeUpDown - New Control Magnifier - New Control MaskedTextBox - New Control MessageBox NumericUpDown RichTextBox RichTextBoxFormatBar - Updated .NET 3.5 binaries and SourcePlease note: The Extended WPF Toolkit 3.5 is dependent on .NET Framework 3.5 and the WPFToolkit. 
You must install .NET Framework 3.5 and the WPFToolkit in order to use any features in the To...New ProjectsAny Sudoku Solver: Program to solving any existing sudoku puzzle.CeairCarbin: One Project About A Carbin Department Manager System About News WrokFlow Saralechat room: test technology on webECTS Multi Server - No Domain Group Policy enabled: These bits help out eliminating exceptions that occur when ECTS is deployed with no organization group password policy .. Especially eliminating this error : "Negating the minimum value of a twos complement number is invalid " EF4 Linq Extensions: Extensions for Entity Framework 4 to support things like Enums in Linq queriesENUH09 v2: Felles prosjekt for ENUH09EWS FastTransfer Stream Parser: This is a sample project to demonstrate how the fasttransfer Stream that is created using the ExportItems and UploadItems Exchange Web Services operations can be parsed to show the underlying properties that make up this stream. FRC3792: JCPennys/University of Missouri - Columbia/ University extensions FRC Source Control.FYP: FTP FYP ProjectGastrOS - openEHR based Endoscopy Application: GastrOS is an endoscopic reporting application based on open standards: openEHR and MST. GUI is driven by Archetypes/Templates. It is part of our research at the University of Auckland to investigate software maintainability and interoperability. Uses openEHR.Net on CodePlexgoldenlion: This is the project for study.im2011: ??????ImEngine: FI softwareMars Tanks: Fall 2010 C Sc 335 Final Project, University of Arizona.MijnAlbum.nl Downloader: MijnAlbum.nl Downloader is een programma waarmee je in één keer een compleet album van mijnalbum.nl op kunt slaan. NetReports: NetReport is a free open-source project to create either reportsNScript: NScriptooMetaWeblog: MetaWebLogger is a .NET C# library that makes it really simple to include in your blog engine so that you can use Windows Live Writer or other blog writing tools that support the MetaWeblog API to write and edit posts in your blogging engine The library is developed in .NET C# 4Personal Library: libraryphantom images (memories): A project I have written long ago Images you take as reminders of things that you learn , this can include a definition you read from book, a chart, a proverb or an ayat from Quran. They will popup in a semitransparent window that u can click throw. It is like a ghost window. Remote Directory Sync: Remote Directory Sync will allow users to publish and subscribe to repositories in order to keep directories in Sync. Security will be a focus of this project and communication channels will be encrypted. PNRP will be used to decentralize the communication.RiaType: RiaType suggests several architecture guidances to industrialize ria developments with Silverlight and WCF RIA ServicesSetFileDate: This is just a simple app to set some File DateTime properties.SGFConvert: Utility to modify SGF (Smart Game Format) files.Shiny Wizard, control for WPF & Silverlight: Shiny Wizard makes it easier for WPF & Silverlight .NET Developers to create non trivial wizard flows using declarative approach. Shiny Wizard control is MVVM enabled.Silverlight book viewer: Silverlight book viewer allows users to find and watch books using Silverlight Deep Zoom technology. 
It is used for publishing antique books from Kyiv Taras Shevchenko National University's library.Silverlight Gauge Control: Gauge Control For SilverlightSQL DataTransfer: This project is aimed for people who need to downgrade sql databases, using the SQL Management object libary. The UI is WPF, based on the MVVM design pattern.SQL Server Health Check: This Health Check App. can examine the health of the SQL Server databases in an organisation and consists of a number of checks: • Installation check • Database check • Performance check • Backup check • Security checkStarcraft Calendar: A calendar application for Starcraft community , based on Team Liquid Calendar Xml & Service.test_808: testUse the Scene Editor of Unity 3D in other non-visual environment to create games: Use the Scene Editor of Unity 3D in other non-visual environments to create 2d or 3d games.WF4: Samples for WF4Window CE Component Wizard: Wizard to create Windows CE catalog component.XBMC Connect .Net: The goal of this project to create a .Net client API to consume the REST based API of XMBC. This will API gives access to all the functionality of XBMC (video and music library, changing settings, ...) without having to handle the details related to the web service communication.Xeno3D 2.0: Xeno3D 2.0 is a direct evolution of CTP 2 of xeno found on this site. Why a new project? Because it is almost entirely new, including a brand new set of editors. Including a node based shader editor. Still early days, though. I'll update the site and svn often. Yet Another OSX Dock Wannabe: Create a simple OSX Dock-like component for Silverlight 4 that is made freely available to the user community.

    Read the article

  • CodePlex Daily Summary for Friday, July 06, 2012

    CodePlex Daily Summary for Friday, July 06, 2012Popular ReleasesTaskScheduler ASP.NET: Release 3 - 1.2.0.0: Release 3 - Version 1.2.0.0 That version was altered only the library: In TaskScheduler was added new properties: UseBackgroundThreads Enables the use of separate threads for each task. StoreThreadsInPool Manager enables to store in the Pool threads that are performing the tasks. OnStopSchedulerAutoCancelThreads Scheduler allows aborting threads when it is stopped. false if the scheduler is not aborted the threads that are running. AutoDeletedExecutedTasks Allows Manager Delete Task afte...xUnit.net Contrib: xunitcontrib-resharper 0.6 (RS 7.0, 6.1.1): xunitcontrib release 0.6 (ReSharper runner) This release provides a test runner plugin for Resharper 7.0 (EAP build 82) and 6.1, targetting all versions of xUnit.net. (See the xUnit.net project to download xUnit.net itself.) Copies of the plugin that support previous verions of ReSharper can be downloaded from this release. The plan is to support the latest revisions of the last two paid-for major versions of ReSharper (namely 7.0 and 6.1) Also note that all builds work against ALL VERSIONS...Umbraco CMS: Umbraco 4.8.0 Beta: Whats newuComponents in the core Multi-Node Tree Picker, Multiple Textstring, Slider and XPath Lists Easier Lucene searching built in IFile providers for easier file handling Updated 3rd party libraries Applications / Trees moved out of the database SQL Azure support added Various bug fixes Getting Started A great place to start is with our Getting Started Guide: Getting Started Guide: http://umbraco.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=197051 Make sure to...CODE Framework: 4.0.20704.0: See CODE Framework (.NET) Change Log for changes in this version.?????????? - ????????: All-In-One Code Framework ??? 2012-07-04: http://download.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1codechs&DownloadId=216140 ???OneCode??????,??????????10????Microsoft OneCode Sample,????4?Windows Base Sample,2?XML Sample?4?ASP.NET Sample。???????????。 ????,?????。http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 Windows Base Sample CSCheckOSBitness VBCheckOSBitness CSCheckOSVersion VBCheckOSVersion XML Sample CSXPath VBXPath ASP.NET Sample CSASPNETDataPager VBASPNET...sheetengine - Isometric HTML5 JavaScript Display Engine: sheetengine v1.0: The first release of sheetengine. See sheetengine.codeplex.com for a list of features and examples.AssaultCube Reloaded: 2.5.1 Intrepid Fixed: Linux has Ubuntu 11.10 32-bit precompiled binaries and Ubuntu 10.10 64-bit precompiled binaries, but you can compile your own as it also contains the source. If you are using Mac or other operating systems, download the Linux package. Try to compile it. If it fails, download a virtual machine. The server pack is ready for both Windows and Linux, but you might need to compile your own for Linux (source included) If you use the default maprot or any maprot, you need to fix it Well, 2.5 was...xUnit.net - Unit testing framework for C# and .NET (a successor to NUnit): xUnit.net 1.9.1: xUnit.net release 1.9.1Build #1600 Important note for Resharper users: Resharper support has been moved to the xUnit.net Contrib project. Important note for TestDriven.net users: If you are having issues running xUnit.net tests in TestDriven.net, especially on 64-bit Windows, we strongly recommend you upgrade to TD.NET version 3.0 or later. 
Important note for VS2012 users: The VS2012 runner is in the Visual Studio Gallery now, and should be installed via Tools | Extension Manager from insi...MVC Controls Toolkit: Mvc Controls Toolkit 2.2.0: Added Modified all Mv4 related features to conform with the Mvc4 RC Now all items controls accept any IEnumerable<T>(before just List<T> were accepted by most of controls) retrievalManager class that retrieves automatically data from a data source whenever it catchs events triggered by filtering, sorting, and paging controls move method to the updatesManager to move one child objects from a father to another. The move operation can be undone like the insert, update and delete operatio...D3API.Net: TESTING TOOLS (PRE-BLIZZARD API RELEASE): PLEASE NOTE: This release is COMPLETELY SEPARATE from the API. It is intended only so development with this API can begin. This will not be a maintained part of the project. (The Test Application might evolve, but the Test API will not) This release is to address the issue that since Blizzard hasn't released the API, you cannot test this API. Because of this, I decided to create a Test Application AND a Test API that includes the following: Test Application: --Has built in examples from Bl...RTF DOM Parser: 2012-7-3 Relasese: Fix some bug when parse RTFBlackJumboDog: Ver5.6.6: 2012.07.03 Ver5.6.6 (1) ???????????ftp://?????????、????LIST?????Mini SQL Query: Mini SQL Query (v1.0.68.441): Just a bug fix release for when the connections try to refresh after an edit. Make sure you read the Quickstart for an introduction.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.58: Fix for Issue #18296: provide "ALL" value to the -ignore switch to ignore all error and warning messages. Fix for issue #18293: if encountering EOF before a function declaration or expression is properly closed, throw an appropriate error and don't crash. Adjust the variable-renaming algorithm so it's very specific when renaming variables with the same number of references so a single source file ends up with the same minified names on different platforms. add the ability to specify kno...LogExpert: 1.4 build 4566: This release for the 1.4 version line contains various fixes which have been made some times ago. Until now these fixes were only available in the 1.5 alpha versions. It also contains a fix for: 710. Column finder (press F8 to show) Terminal server issues: Multiple sessions with same user should work now Settings Export/Import available via Settings Dialog still incomple (e.g. tab colors are not saved) maybe I change the file format one day no command line support yet (for importin...CommonLibrary.NET: CommonLibrary.NET 0.9.8.5 - Final Release: A collection of very reusable code and components in C# 4.0 ranging from ActiveRecord, Csv, Command Line Parsing, Configuration, Holiday Calendars, Logging, Authentication, and much more. FluentscriptCommonLibrary.NET 0.9.8 contains a scripting language called FluentScript. Releases notes for FluentScript located at http://fluentscript.codeplex.com/wikipage?action=Edit&title=Release%20Notes&referringTitle=Documentation Fluentscript - 0.9.8.5 - Final ReleaseApplication: FluentScript Versio...SharePoint 2010 Metro UI: SharePoint 2010 Metro UI8: Please review the documentation link for how to install. Installation takes some basic knowledge of how to upload and edit SharePoint Artifact files. Please view the discussions tab for ongoing FAQsnopCommerce. 
Open source shopping cart (ASP.NET MVC): nopcommerce 2.60: Highlight features & improvements: • Significant performance optimization. • Use AJAX for adding products to the cart. • New flyout mini-shopping cart. • Auto complete suggestions for product searching. • Full-Text support. • EU cookie law support. To see the full list of fixes and changes please visit the release notes page (http://www.nopCommerce.com/releasenotes.aspx).THE NVL Maker: The NVL Maker Ver 3.51: http://download.codeplex.com/Download?ProjectName=nvlmaker&DownloadId=371510 ????:http://115.com/file/beoef05k#THE-NVL-Maker-ver3.51-sim.7z ????:http://www.mediafire.com/file/6tqdwj9jr6eb9qj/THENVLMakerver3.51tra.7z ======================================== ???? ======================================== 3.51 beta ???: ·?????????????????????? ·?????????,?????????0,?????????????????????? ·??????????????????????????? ·?????????????TJS????(EXP??) ·??4:3???,???????????????,??????????? ·?????????...????: ????2.0.3: 1、???????????。 2、????????。 3、????????????。 4、bug??,????。New Projects40FINGERS DotNetNuke Demo Skins: Collection of DotNetNuke Demo Skins, create for you by Timo Breumelhof of 40FINGERS. Check out the individual downloads for more informationASP.NET Virtual Templates: This project allows you to provide files like views, stylsheets and scripts embedded in an assembly to any web application by using the virtual file system.Bauble: Bauble is a dock launcher written in C# utilizing WPF. As a launcher, it contains an animated list application icons, and will open their program on click.BBQ Assistant: Project to create and maintain a Windows Phone application to allow users to enter BBQ events and record timelines.CharmFlyout - A Metro Flyout Custom Control: A custom control for displaying flyouts from the settings charm in Windows Metro style (WinRT) applications written in C# / XAML.dotNetDR_Auth2????API????: This is SUMMARYEntacts: Entacts app is a contact app for electronic contacts.FlMML customized: FlMML customized ?、FlMML?MML?????????????????????。 FlMML?Flash?????????????????。 MML????????????????????????????。 FluidDb: FluidDb is a better microORM. Unique features, excellent performance, and cleaner code in as few trips to the database as possible.Gabe's gubb Framework (GGF): GGF is a set of classes built to help you work with the REST-based gubb API(http://gubb.net). gubb is a list management site similar to (better than?) Remember the Milk. The core of GGF allows for object/transaction modeling and facilitation of HTTP-based requests. Written in C#.Grandshot 2: Grandshot 2 is an awesome 2D shooter with a great ragdoll and animation system, vehicles and lots of blood and gore. Written in VB.NetJason's CG: This is jason's CGJQS: This is a simple WriterService.Lincoln Wood: An evolutionary implementation of the next gen Lincoln Wood Community environment using MVC2 and other good stuff.Microsoft CRM PluginQuickDeploy: Small tool for deploy your CRM 2011 plugin very fast, especially in the development process. 
It can also be added into the build event in Visual Studio 2010.Morus: socialMouseBot: Prevent a PC from sleeping the silly way: move the mouse cursor on a timer!QIF AS9102 Form Design Study: This is a Visual Studio 2010 C# Winform application that uses a simple AS9102 form as a design study for consuming (C3) and producing (C2) QIF sample xml files.RaveIt: Windows phone 7 drumm machine appRESTFunctoids: RESTFunctoids for BizTalk 2010 allows you to consume REST Services directly from your map.Secure(): Secure() MS Repo This repository will host Microsoft-oriented code from my site Secure() at http://nathanv.comSharpMik: SharpMik is a library to play Amiga music using C#SharpSyslog: Syslog server lib for .Net/C# (v3.5) implementing RFC 5424 The Syslog Protocol. SuperMetroQuiz: O SuperQuiz é um jogo de Quiz para Windows 8, desenvolvido em C#, que utilizou como base o template grid, disponível no Visual Studio 2012 RC. takela: An ASP.Net MVC Razor Project. TaskScheduler ASP.NET: Simple Example of how to schedule tasks in ASP.NET. works in WebForms, MVC and others, dont need requests or infinite loops. provides full control over the taskTeenyGrab: Take screenshots and upload them to FTP server at the touch of a button.testdd07052012git01: cxvtestdd07052012git1: xzctestdd07052012hg01: cvtestddhg0705201201: xzctesttfs07052012tfs01: zxtesttom07052012git02: rfeThTa7Maged: Its Point of sale project TX264: A GUI for x264, ffmpeg, lame, faac, qaac, neroaacenc, oggenc, aften, lame, flac, mp4box and mkvtoolnix.Visual Studio Extension - Collapse Solution: Visual Studio extension that collapses every item in the Solution Explorer tool window at the solution or project level.Visual Studio Extension - Enable Code Analysis: Visual Studio extension that turns Code Analysis On or Off for all projects in the solution.VocalsBase: not foundWPF Active Directory Explorer: Robust and Extensible Active Directory Explorer and Editoryeg: . Net deneme

    Read the article

  • The 20 Most Important Keyboard Shortcuts For Windows PCs

    - by Chris Hoffman
    Keyboard shortcuts are practically essential for using any type of PC. They’ll speed up almost everything you do. But long lists of keyboard shortcuts can quickly become overwhelming if you’re just getting started. This list will cover the most useful keyboard shortcuts that every Windows user should know. If you haven’t used keyboard shortcuts much, these will show you just how useful keyboard shortcuts can be.

    Windows Key + Search
    The Windows key is particularly important on Windows 8 — especially before Windows 8.1 — because it allows you to quickly return to the Start screen. On Windows 7, it opens the Start menu. Either way, you can start typing immediately after you press the Windows key to search for programs, settings, and files. For example, if you want to launch Firefox, you can press the Windows key, start typing the word Firefox, and press Enter when the Firefox shortcut appears. It’s a quick way to launch programs, open files, and locate Control Panel options without even touching your mouse and without digging through a cluttered Start menu. You can also use the arrow keys to select the shortcut you want to launch before pressing Enter.

    Copy, Cut, Paste
    Copy, Cut, and Paste are extremely important keyboard shortcuts for text-editing. If you do any typing on your computer, you probably use them. These options can be accessed using the mouse, either by right-clicking on selected text or opening the application’s Edit menu, but this is the slowest way to do it. After selecting some text, press Ctrl+C to copy it or Ctrl+X to cut it. Position the cursor where you want the text and use Ctrl+V to paste it. These shortcuts can save you a huge amount of time over using the mouse.

    Search the Current Page or File
    To quickly perform a search in the current application — whether you’re in a web browser, PDF viewer, document editor, or almost any other type of application — press Ctrl+F. The application’s search (or “Find”) feature will pop up, and you can instantly start typing a phrase you want to search for. You can generally press Enter to go to the next appearance of the word or phrase in the document, quickly searching through it for what you’re interested in.

    Switch Between Applications and Tabs
    Rather than clicking buttons on your taskbar, Alt+Tab is a very quick way to switch between running applications. Windows orders the list of open windows by the order you accessed them, so if you’re only using two different applications, you can just press Alt+Tab to quickly switch between them. If switching between more than two windows, you’ll have to hold the Alt key and press Tab repeatedly to toggle through the list of open windows. If you miss the window you want, you can always press Alt+Shift+Tab to move through the list in reverse. To move between tabs in an application — such as the tabs in your web browser — press Ctrl+Tab. Ctrl+Shift+Tab will move through tabs in reverse.

    Quickly Print
    If you’re the kind of person who still prints things, you can quickly open the print window by pressing Ctrl+P. This can be faster than hunting down the Print option in every program you want to print something from.

    Basic Browser Shortcuts
    Web browser shortcuts can save you tons of time, too. Ctrl+T is a very useful one, as it will open a new tab with the address bar focused, so you can quickly press Ctrl+T, type a search phrase or web address, and press Enter to go there. To go back or forward while browsing, hold the Alt key and press the left or right arrow keys.
    If you’d just like to focus your web browser’s address bar so you can type a new web address or search without opening a new tab, press Ctrl+L. You can then start typing something and press Enter.

    Close Tabs and Windows
    To quickly close the current application, press Alt+F4. This works on the desktop and even in new Windows 8-style applications. To quickly close the current browser tab or document, press Ctrl+W. This will often close the current window if there are no other tabs open.

    Lock Your Computer
    When you’re done using your computer and want to step away, you may want to lock it. People won’t be able to log in and access your desktop unless they know your password. You can do this from the Start menu or Start screen, but the fastest way to lock your screen is by quickly pressing Windows Key + L before you get up.

    Access the Task Manager
    Ctrl+Alt+Delete will take you to a screen that allows you to quickly launch the Task Manager or perform other operations, such as signing out. This is particularly useful because it can be used to recover from situations where your computer doesn’t appear responsive or isn’t accepting input. For example, if a full-screen game becomes unresponsive, Ctrl+Alt+Delete will often allow you to escape from it and end it via the Task Manager.

    Windows 8 Shortcuts
    On Windows 8 PCs, there are other very important keyboard shortcuts. Windows Key + C will open your Charms bar, while Windows Key + Tab will open the new App Switcher. These keyboard shortcuts will allow you to avoid the hot corners, which can be tedious to use with a mouse. On the desktop side, Windows Key + D will take you back to the desktop from anywhere. Windows Key + X will open a special “power user menu” that gives you quick access to options that are hidden in the new Windows 8 interface, including Shut Down, Restart, and Control Panel.

    If you’re interested in learning more keyboard shortcuts, be sure to check our longer lists of 47 keyboard shortcuts that work in all web browsers and 42+ keyboard shortcuts to speed up text-editing.

    Image Credit: Jeroen Bennink on Flickr

    Read the article

  • Windows Azure: Major Updates for Mobile Backend Development

    - by ScottGu
    This week we released some great updates to Windows Azure that make it significantly easier to develop mobile applications that use the cloud. These new capabilities include: Mobile Services: Custom API support Mobile Services: Git Source Control support Mobile Services: Node.js NPM Module support Mobile Services: A .NET API via NuGet Mobile Services and Web Sites: Free 20MB SQL Database Option for Mobile Services and Web Sites Mobile Notification Hubs: Android Broadcast Push Notification Support All of these improvements are now available to use immediately (note: some are still in preview).  Below are more details about them. Mobile Services: Custom APIs, Git Source Control, and NuGet Windows Azure Mobile Services provides the ability to easily stand up a mobile backend that can be used to support your Windows 8, Windows Phone, iOS, Android and HTML5 client applications.  Starting with the first preview we supported the ability to easily extend your data backend logic with server side scripting that executes as part of client-side CRUD operations against your cloud back data tables. With today’s update we are extending this support even further and introducing the ability for you to also create and expose Custom APIs from your Mobile Service backend, and easily publish them to your Mobile clients without having to associate them with a data table. This capability enables a whole set of new scenarios – including the ability to work with data sources other than SQL Databases (for example: Table Services or MongoDB), broker calls to 3rd party APIs, integrate with Windows Azure Queues or Service Bus, work with custom non-JSON payloads (e.g. Windows Periodic Notifications), route client requests to services back on-premises (e.g. with the new Windows Azure BizTalk Services), or simply implement functionality that doesn’t correspond to a database operation.  The custom APIs can be written in server-side JavaScript (using Node.js) and can use Node’s NPM packages.  We will also be adding support for custom APIs written using .NET in the future as well. Creating a Custom API Adding a custom API to an existing Mobile Service is super easy.  Using the Windows Azure Management Portal you can now simply click the new “API” tab with your Mobile Service, and then click the “Create a Custom API” button to create a new Custom API within it: Give the API whatever name you want to expose, and then choose the security permissions you’d like to apply to the HTTP methods you expose within it.  You can easily lock down the HTTP verbs to your Custom API to be available to anyone, only those who have a valid application key, only authenticated users, or administrators.  Mobile Services will then enforce these permissions without you having to write any code: When you click the ok button you’ll see the new API show up in the API list.  Selecting it will enable you to edit the default script that contains some placeholder functionality: Today’s release enables Custom APIs to be written using Node.js (we will support writing Custom APIs in .NET as well in a future release), and the Custom API programming model follows the Node.js convention for modules, which is to export functions to handle HTTP requests. The default script above exposes functionality for an HTTP POST request. To support a GET, simply change the export statement accordingly.  
Below is an example of some code for reading and returning data from Windows Azure Table Storage using the Azure Node API: After saving the changes, you can now call this API from any Mobile Service client application (including Windows 8, Windows Phone, iOS, Android or HTML5 with CORS). Below is the code for how you could invoke the API asynchronously from a Windows Store application using .NET and the new InvokeApiAsync method, and data-bind the results to control within your XAML:     private async void RefreshTodoItems() {         var results = await App.MobileService.InvokeApiAsync<List<TodoItem>>("todos", HttpMethod.Get, parameters: null);         ListItems.ItemsSource = new ObservableCollection<TodoItem>(results);     }    Integrating authentication and authorization with Custom APIs is really easy with Mobile Services. Just like with data requests, custom API requests enjoy the same built-in authentication and authorization support of Mobile Services (including integration with Microsoft ID, Google, Facebook and Twitter authentication providers), and it also enables you to easily integrate your Custom API code with other Mobile Service capabilities like push notifications, logging, SQL, etc. Check out our new tutorials to learn more about to use new Custom API support, and starting adding them to your app today. Mobile Services: Git Source Control Support Today’s Mobile Services update also enables source control integration with Git.  The new source control support provides a Git repository as part your Mobile Service, and it includes all of your existing Mobile Service scripts and permissions. You can clone that git repository on your local machine, make changes to any of your scripts, and then easily deploy the mobile service to production using Git. This enables a really great developer workflow that works on any developer machine (Windows, Mac and Linux). To use the new support, navigate to the dashboard for your mobile service and select the Set up source control link: If this is your first time enabling Git within Windows Azure, you will be prompted to enter the credentials you want to use to access the repository: Once you configure this, you can switch to the configure tab of your Mobile Service and you will see a Git URL you can use to use your repository: You can use this URL to clone the repository locally from your favorite command line: > git clone https://scottgutodo.scm.azure-mobile.net/ScottGuToDo.git Below is the directory structure of the repository: As you can see, the repository contains a service folder with several subfolders. Custom API scripts and associated permissions appear under the api folder as .js and .json files respectively (the .json files persist a JSON representation of the security settings for your endpoints). Similarly, table scripts and table permissions appear as .js and .json files, but since table scripts are separate per CRUD operation, they follow the naming convention of <tablename>.<operationname>.js. Finally, scheduled job scripts appear in the scheduler folder, and the shared folder is provided as a convenient location for you to store code shared by multiple scripts and a few miscellaneous things such as the APNS feedback script. 
Lets modify the table script todos.js file so that we have slightly better error handling when an exception occurs when we query our Table service: todos.js tableService.queryEntities(query, function(error, todoItems){     if (error) {         console.error("Error querying table: " + error);         response.send(500);     } else {         response.send(200, todoItems);     }        }); Save these changes, and now back in the command line prompt commit the changes and push them to the Mobile Services: > git add . > git commit –m "better error handling in todos.js" > git push Once deployment of the changes is complete, they will take effect immediately, and you will also see the changes be reflected in the portal: With the new Source Control feature, we’re making it really easy for you to edit your mobile service locally and push changes in an atomic fashion without sacrificing ease of use in the Windows Azure Portal. Mobile Services: NPM Module Support The new Mobile Services source control support also allows you to add any Node.js module you need in the scripts beyond the fixed set provided by Mobile Services. For example, you can easily switch to use Mongo instead of Windows Azure table in our example above. Set up Mongo DB by either purchasing a MongoLab subscription (which provides MongoDB as a Service) via the Windows Azure Store or set it up yourself on a Virtual Machine (either Windows or Linux). Then go the service folder of your local git repository and run the following command: > npm install mongoose This will add the Mongoose module to your Mobile Service scripts.  After that you can use and reference the Mongoose module in your custom API scripts to access your Mongo database: var mongoose = require('mongoose'); var schema = mongoose.Schema({ text: String, completed: Boolean });   exports.get = function (request, response) {     mongoose.connect('<your Mongo connection string> ');     TodoItemModel = mongoose.model('todoitem', schema);     TodoItemModel.find(function (err, items) {         if (err) {             console.log('error:' + err);             return response.send(500);         }         response.send(200, items);     }); }; Don’t forget to push your changes to your mobile service once you are done > git add . > git commit –m "Switched to use Mongo Labs" > git push Now our Mobile Service app is using Mongo DB! Note, with today’s update usage of custom Node.js modules is limited to Custom API scripts only. We will enable it in all scripts (including data and custom CRON tasks) shortly. New Mobile Services NuGet package, including .NET 4.5 support A few months ago we announced a new pre-release version of the Mobile Services client SDK based on portable class libraries (PCL). Today, we are excited to announce that this new library is now a stable .NET client SDK for mobile services and is no longer a pre-release package. Today’s update includes full support for Windows Store, Windows Phone 7.x, and .NET 4.5, which allows developers to use Mobile Services from ASP.NET or WPF applications. You can install and use this package today via NuGet. Mobile Services and Web Sites: Free 20MB Database for Mobile Services and Web Sites Starting today, every customer of Windows Azure gets one Free 20MB database to use for 12 months free (for both dev/test and production) with Web Sites and Mobile Services. 
When creating a Mobile Service or a Web Site, simply chose the new “Create a new Free 20MB database” option to take advantage of it: You can use this free SQL Database together with the 10 free Web Sites and 10 free Mobile Services you get with your Windows Azure subscription, or from any other Windows Azure VM or Cloud Service. Notification Hubs: Android Broadcast Push Notification Support Earlier this year, we introduced a new capability in Windows Azure for sending broadcast push notifications at high scale: Notification Hubs. In the initial preview of Notification Hubs you could use this support with both iOS and Windows devices.  Today we’re excited to announce new Notification Hubs support for sending push notifications to Android devices as well. Push notifications are a vital component of mobile applications.  They are critical not only in consumer apps, where they are used to increase app engagement and usage, but also in enterprise apps where up-to-date information increases employee responsiveness to business events.  You can use Notification Hubs to send push notifications to devices from any type of app (a Mobile Service, Web Site, Cloud Service or Virtual Machine). Notification Hubs provide you with the following capabilities: Cross-platform Push Notifications Support. Notification Hubs provide a common API to send push notifications to iOS, Android, or Windows Store at once.  Your app can send notifications in platform specific formats or in a platform-independent way.  Efficient Multicast. Notification Hubs are optimized to enable push notification broadcast to thousands or millions of devices with low latency.  Your server back-end can fire one message into a Notification Hub, and millions of push notifications can automatically be delivered to your users.  Devices and apps can specify a number of per-user tags when registering with a Notification Hub. These tags do not need to be pre-provisioned or disposed, and provide a very easy way to send filtered notifications to an infinite number of users/devices with a single API call.   Extreme Scale. Notification Hubs enable you to reach millions of devices without you having to re-architect or shard your application.  The pub/sub routing mechanism allows you to broadcast notifications in a super-efficient way.  This makes it incredibly easy to route and deliver notification messages to millions of users without having to build your own routing infrastructure. Usable from any Backend App. Notification Hubs can be easily integrated into any back-end server app, whether it is a Mobile Service, a Web Site, a Cloud Service or an IAAS VM. It is easy to configure Notification Hubs to send push notifications to Android. 
Create a new Notification Hub within the Windows Azure Management Portal (New->App Services->Service Bus->Notification Hub): Then register for Google Cloud Messaging using https://code.google.com/apis/console and obtain your API key, then simply paste that key on the Configure tab of your Notification Hub management page under the Google Cloud Messaging Settings: Then just add code to the OnCreate method of your Android app’s MainActivity class to register the device with Notification Hubs: gcm = GoogleCloudMessaging.getInstance(this); String connectionString = "<your listen access connection string>"; hub = new NotificationHub("<your notification hub name>", connectionString, this); String regid = gcm.register(SENDER_ID); hub.register(regid, "myTag"); Now you can broadcast notification from your .NET backend (or Node, Java, or PHP) to any Windows Store, Android, or iOS device registered for “myTag” tag via a single API call (you can literally broadcast messages to millions of clients you have registered with just one API call): var hubClient = NotificationHubClient.CreateClientFromConnectionString(                   “<your connection string with full access>”,                   "<your notification hub name>"); hubClient.SendGcmNativeNotification("{ 'data' : {'msg' : 'Hello from Windows Azure!' } }", "myTag”); Notification Hubs provide an extremely scalable, cross-platform, push notification infrastructure that enables you to efficiently route push notification messages to millions of mobile users and devices.  It will make enabling your push notification logic significantly simpler and more scalable, and allow you to build even better apps with it. Learn more about Notification Hubs here on MSDN . Summary The above features are now live and available to start using immediately (note: some of the services are still in preview).  If you don’t already have a Windows Azure account, you can sign-up for a free trial and start using them today.  Visit the Windows Azure Developer Center to learn more about how to build apps with it. Hope this helps, Scott P.S. In addition to blogging, I am also now using Twitter for quick updates and to share links. Follow me at: twitter.com/scottgu
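    One practical note on the Android registration snippet shown above: gcm.register() and hub.register() both perform blocking network calls, so on Android they need to run off the UI thread. The following is only a minimal sketch of that idea, assuming the GoogleCloudMessaging class from Google Play services and the NotificationHub class from the Notification Hubs Android SDK (the import paths are assumptions); the sender ID, hub name, and connection string are placeholders you would replace with your own values.

    ```java
    import android.content.Context;
    import android.os.AsyncTask;
    import android.util.Log;

    import com.google.android.gms.gcm.GoogleCloudMessaging;
    import com.microsoft.windowsazure.messaging.NotificationHub;

    // Runs the GCM + Notification Hub registration on a background thread.
    public class RegisterForPushTask extends AsyncTask<Void, Void, Void> {

        private static final String SENDER_ID = "<your GCM sender id>";                           // placeholder
        private static final String HUB_NAME = "<your notification hub name>";                    // placeholder
        private static final String CONNECTION_STRING = "<your listen access connection string>"; // placeholder

        private final GoogleCloudMessaging gcm;
        private final NotificationHub hub;

        public RegisterForPushTask(Context context) {
            gcm = GoogleCloudMessaging.getInstance(context);
            hub = new NotificationHub(HUB_NAME, CONNECTION_STRING, context);
        }

        @Override
        protected Void doInBackground(Void... params) {
            try {
                String regId = gcm.register(SENDER_ID); // blocking call to GCM
                hub.register(regId, "myTag");           // blocking call to the Notification Hub
            } catch (Exception e) {
                Log.e("NotificationHubs", "Push registration failed", e);
            }
            return null;
        }
    }
    ```

    You would kick this off from MainActivity.onCreate with new RegisterForPushTask(this).execute(); instead of calling the registration methods directly on the UI thread.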

    Read the article

  • Waterfall Model (SDLC) vs. Prototyping Model

    The characters in the fable of the Tortoise and the Hare can easily be used to demonstrate the similarities and differences between the Waterfall and Prototyping software development models. This children’s fable is about a race between a consistently slow-moving but steadfast turtle and an extremely fast but unreliable rabbit. After closely comparing each character’s attributes in correlation with both software development models, a trend appears: the Waterfall Model closely resembles the Tortoise in that it is typically a slow-moving process, broken up into multiple sequential steps that must be executed in a standard linear pattern. The Tortoise can be quoted several times in the story saying “Slow and steady wins the race.” This is the perfect mantra for the Waterfall Model, which is often seen as cumbersome and slow moving.

    Waterfall Model Phases

    Requirement Analysis & Definition
    This phase focuses on defining requirements for a project that is to be developed and determining if the project is even feasible. Requirements are collected by analyzing existing systems and functionality in correlation with the needs of the business and the desires of the end users. The desired output for this phase is a list of specific requirements from the business that are to be designed and implemented in the subsequent steps. In addition, this phase is used to determine if any value will be gained by completing the project.

    System Design
    This phase focuses primarily on the actual architectural design of a system, and how it will interact within itself and with other existing applications. Projects at this level should be viewed at a high level so that actual implementation details are decided in the implementation phase. However, major environmental decisions like hardware and platform choices are typically made in this phase. Furthermore, the basic goal of this phase is to design the application at the system level, in which classes, interfaces, and interactions are defined. Additionally, scalability, distribution, and reliability should be considered in every decision. The desired output for this phase is a functional design document that states all of the architectural decisions that have been made in regards to the project, as well as diagrams such as sequence and class diagrams.

    Software Design
    This phase focuses primarily on refining the decisions found in the functional design document. Classes and interfaces are further broken down into logical modules based on the interfaces and interactions previously indicated. The output of this phase is a formal design document.

    Implementation / Coding
    This phase focuses primarily on implementing the previously defined modules as units of code. These units are developed independently and then integrated as the system is put together as a whole.

    Software Integration & Verification
    This phase primarily focuses on testing each of the units of code developed as well as testing the system as a whole. There are two basic types of testing in this phase: unit tests and integration tests. Unit tests are built to exercise the functionality of a code unit and ensure that it performs its desired task. Integration testing tests the system as a whole because it focuses on the results of combining specific units of code and validating them against expected results. The output of this phase is a test plan that records each test with its expected and actual results.
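    To make the unit-versus-integration distinction concrete, here is a minimal sketch using JUnit 4. The Calculator and ReceiptPrinter classes are purely illustrative stand-ins (they are not from the article), defined inline so the example compiles on its own.

    ```java
    import static org.junit.Assert.assertEquals;

    import org.junit.Test;

    public class CalculatorTests {

        // Hypothetical classes under test, defined inline so the sketch is self-contained.
        static class Calculator {
            int add(int a, int b) { return a + b; }
        }

        static class ReceiptPrinter {
            private final Calculator calc;
            ReceiptPrinter(Calculator calc) { this.calc = calc; }
            String printTotal(int a, int b) { return "Total: " + calc.add(a, b); }
        }

        // Unit test: exercises a single unit of code in isolation and checks that it
        // performs its desired task.
        @Test
        public void addReturnsSumOfTwoNumbers() {
            assertEquals(5, new Calculator().add(2, 3));
        }

        // Integration-style test: combines two units and validates the combined result
        // against an expected value.
        @Test
        public void printerFormatsTotalComputedByCalculator() {
            assertEquals("Total: 5", new ReceiptPrinter(new Calculator()).printTotal(2, 3));
        }
    }
    ```

    Each test would then go into the phase’s test plan together with its expected and actual result.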
    System Verification
    This phase primarily focuses on testing the system as a whole against the list of project requirements and the desired operating environment.

    Operation & Maintenance
    This phase primarily focuses on handing the completed project over to the customer so that they can verify that all of their original requirements have been met. This phase will also validate the correctness of those requirements and whether any changes need to be made. In addition, any problems not resolved in the previous phase will be handled in this section.

    The Waterfall Model’s linear and sequential methodology does offer a project certain advantages and disadvantages.

    Advantages of the Waterfall Model
    - Simple to implement and execute, for a single project or company-wide
    - Limited demand on resources
    - Large emphasis on documentation

    Disadvantages of the Waterfall Model
    - Completed phases cannot be revisited, regardless of whether issues arise within a project
    - Accurate requirements are never gathered prior to the completion of the requirements phase, due to the lack of clarification of the client’s desires
    - Small changes or errors that arise in applications may cause additional problems
    - The client cannot change any requirements once the requirements phase has been completed, leaving them no option for changes as their desires evolve
    - Excess documentation
    - Phases are cumbersome and slow moving

    Learn more about the Major Processes in the Software Development Life Cycle and the Waterfall Model.

    Conversely, the Hare shares similar traits with the Prototyping software development model, in that ideas are rapidly converted to basic working examples and subsequent changes are made to quickly align the project with the customer’s desires as they are formulated and as the software strays from the customer’s vision. The basic concept of prototyping is to eliminate the use of well-defined project requirements. Projects are allowed to grow as the customer’s needs and requests grow. Projects are initially designed according to basic requirements and are refined as the requirements become clearer. This process allows customers to feel their way around the application to ensure that they are developing exactly what they want. This model also works well for determining the feasibility of certain approaches in regards to an application. Prototypes allow for quickly developing examples that implement specific functionality based on certain techniques.

    Advantages of Prototyping
    - Active participation from users and customers
    - Allows customers to change their minds in specifying requirements
    - Customers get a better understanding of the system as it is developed
    - Earlier bug/error detection
    - Promotes communication with customers
    - The prototype could be used as the final production version
    - Reduced time needed to develop applications compared to the Waterfall method

    Disadvantages of Prototyping
    - Promotes constantly redefining project requirements, which can cause major system rewrites
    - Potential for increased complexity of a system as its scope expands
    - The customer could mistake the prototype for the finished, working version
    - Implementation compromises could increase the complexity of applying updates and/or application fixes

    When companies are trying to decide between the Waterfall model and the Prototyping model, they need to evaluate the benefits and disadvantages of both.
    Typically, smaller companies or projects with major time constraints head for more of a Prototyping approach, because it can reduce the time needed to complete the project: the focus is on building the project rather than on defining requirements and scope before the project starts. On the other hand, companies with well-defined requirements and the time to generate proper documentation should steer towards more of a Waterfall model, because they are in a position to obtain clarified requirements and to design an optimal solution prior to the start of coding on a project.

    Read the article

  • Premature-Optimization and Performance Anxiety

    - by James Michael Hare
    While writing my post analyzing the new .NET 4 ConcurrentDictionary class (here), I fell into one of the classic blunders that I myself always love to warn about.  After analyzing the differences of time between a Dictionary with locking versus the new ConcurrentDictionary class, I noted that the ConcurrentDictionary was faster with read-heavy multi-threaded operations.  Then, I made the classic blunder of thinking that because the original Dictionary with locking was faster for those write-heavy uses, it was the best choice for those types of tasks.  In short, I fell into the premature-optimization anti-pattern. Basically, the premature-optimization anti-pattern is when a developer is coding very early for a perceived (whether rightly-or-wrongly) performance gain and sacrificing good design and maintainability in the process.  At best, the performance gains are usually negligible and at worst, can either negatively impact performance, or can degrade maintainability so much that time to market suffers or the code becomes very fragile due to the complexity. Keep in mind the distinction above.  I'm not talking about valid performance decisions.  There are decisions one should make when designing and writing an application that are valid performance decisions.  Examples of this are knowing the best data structures for a given situation (Dictionary versus List, for example) and choosing performance algorithms (linear search vs. binary search).  But these in my mind are macro optimizations.  The error is not in deciding to use a better data structure or algorithm, the anti-pattern as stated above is when you attempt to over-optimize early on in such a way that it sacrifices maintainability. In my case, I was actually considering trading the safety and maintainability gains of the ConcurrentDictionary (no locking required) for a slight performance gain by using the Dictionary with locking.  This would have been a mistake as I would be trading maintainability (ConcurrentDictionary requires no locking which helps readability) and safety (ConcurrentDictionary is safe for iteration even while being modified and you don't risk the developer locking incorrectly) -- and I fell for it even when I knew to watch out for it.  I think in my case, and it may be true for others as well, a large part of it was due to the time I was trained as a developer.  I began college in in the 90s when C and C++ was king and hardware speed and memory were still relatively priceless commodities and not to be squandered.  In those days, using a long instead of a short could waste precious resources, and as such, we were taught to try to minimize space and favor performance.  This is why in many cases such early code-bases were very hard to maintain.  I don't know how many times I heard back then to avoid too many function calls because of the overhead -- and in fact just last year I heard a new hire in the company where I work declare that she didn't want to refactor a long method because of function call overhead.  Now back then, that may have been a valid concern, but with today's modern hardware even if you're calling a trivial method in an extremely tight loop (which chances are the JIT compiler would optimize anyway) the results of removing method calls to speed up performance are negligible for the great majority of applications.  Now, obviously, there are those coding applications where speed is absolutely king (for example drivers, computer games, operating systems) where such sacrifices may be made.  
    But I would strongly advise against such optimization because of its cost.  Many folks that are performing an optimization think it's always a win-win.  They think they're simply adding speed to the application -- what could possibly be wrong with that?  What they don't realize is the cost of their choice.  For every piece of straight-forward code that you obfuscate with performance enhancements, you risk introducing bugs and adding to the long-term technical debt of the application.  It will become so fragile over time that maintenance will become a nightmare.  I've seen such applications in places I have worked.  There are times I've seen applications where the designer was so obsessed with performance that they even designed their own memory management system for their application to try to squeeze out every ounce of performance.  Unfortunately, the application's stability often suffers as a result and it is very difficult for anyone other than the original designer to maintain.

    I've even seen this recently where I heard a C++ developer bemoaning that in VS2010 the iterators are about twice as slow as they used to be because Microsoft added range checking (probably as part of the 0x standard implementation).  To me this was almost a joke.  Twice as slow sounds bad, but it's almost never as bad as you think -- especially if you're gaining safety.  The only time twice is really that much slower is when once was too slow to begin with.  Think about it.  2 minutes is slow as a response time because 1 minute is slow.  But if an iterator takes 1 microsecond to move one position and a new, safer iterator takes 2 microseconds, this is trivial!  The only way you'd ever really notice this would be in iterating a collection just for the sake of iterating (i.e. no other operations).  To my mind, the added safety makes the extra time worth it.

    Always favor safety and maintainability when you can.  I know it can be a hard habit to break, especially if you started out your career early on, or in a language such as C, where developers are very performance conscious.  But in reality, these types of micro-optimizations only end up hurting you in the long run.

    Remember the two laws of optimization.  I'm not sure where I first heard these, but they are so true:

    For beginners: Do not optimize.
    For experts: Do not optimize yet.

    This is so true.  If you're a beginner, resist the urge to optimize at all costs.  And if you are an expert, delay that decision.  As long as you have chosen the right data structures and algorithms for your task, your performance will probably be more than sufficient.  Chances are it will be network, database, or disk hits that will be your slow-down, not your code.  As they say, 98% of your code's bottleneck is in 2% of your code, so premature optimization may add maintenance and safety debt that won't have any measurable impact.  Instead, code for maintainability and safety, and then, and only then, when you find a true bottleneck, go back and optimize further.
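    To illustrate the distinction drawn above between the "macro" decisions worth making up front (the right data structure, the right algorithm) and the micro-optimizations worth delaying, here is a small, purely illustrative Java sketch; the class name and data are made up for the example.

    ```java
    import java.util.Arrays;
    import java.util.Collections;
    import java.util.List;

    public class LookupExample {

        // Macro decision made at design time: the data is kept sorted, so an
        // O(log n) binary search is the appropriate algorithm.
        static boolean containsBinary(List<Integer> sorted, int key) {
            return Collections.binarySearch(sorted, key) >= 0;
        }

        // Straightforward O(n) scan: perfectly fine for small collections, and
        // easy to read and maintain.
        static boolean containsLinear(List<Integer> values, int key) {
            for (int v : values) {
                if (v == key) {
                    return true;
                }
            }
            return false;
        }

        public static void main(String[] args) {
            List<Integer> sorted = Arrays.asList(1, 3, 5, 7, 11, 13);
            System.out.println(containsBinary(sorted, 7));  // true
            System.out.println(containsLinear(sorted, 8));  // false
            // A micro-optimization, by contrast, would be hand-inlining containsLinear
            // into its caller to avoid the method-call overhead -- exactly the kind of
            // change to postpone until a profiler shows a real bottleneck.
        }
    }
    ```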

    Read the article

  • 3D picking lwjgl

    - by Wirde
    I have written some code to preform 3D picking that for some reason dosn't work entirely correct! (Im using LWJGL just so you know.) I posted this at stackoverflow at first but after researching some more in to my problem i found this neat site and tought that you guys might be more qualified to answer this question. This is how the code looks like: if(Mouse.getEventButton() == 1) { if (!Mouse.getEventButtonState()) { Camera.get().generateViewMatrix(); float screenSpaceX = ((Mouse.getX()/800f/2f)-1.0f)*Camera.get().getAspectRatio(); float screenSpaceY = 1.0f-(2*((600-Mouse.getY())/600f)); float displacementRate = (float)Math.tan(Camera.get().getFovy()/2); screenSpaceX *= displacementRate; screenSpaceY *= displacementRate; Vector4f cameraSpaceNear = new Vector4f((float) (screenSpaceX * Camera.get().getNear()), (float) (screenSpaceY * Camera.get().getNear()), (float) (-Camera.get().getNear()), 1); Vector4f cameraSpaceFar = new Vector4f((float) (screenSpaceX * Camera.get().getFar()), (float) (screenSpaceY * Camera.get().getFar()), (float) (-Camera.get().getFar()), 1); Matrix4f tmpView = new Matrix4f(); Camera.get().getViewMatrix().transpose(tmpView); Matrix4f invertedViewMatrix = (Matrix4f)tmpView.invert(); Vector4f worldSpaceNear = new Vector4f(); Matrix4f.transform(invertedViewMatrix, cameraSpaceNear, worldSpaceNear); Vector4f worldSpaceFar = new Vector4f(); Matrix4f.transform(invertedViewMatrix, cameraSpaceFar, worldSpaceFar); Vector3f rayPosition = new Vector3f(worldSpaceNear.x, worldSpaceNear.y, worldSpaceNear.z); Vector3f rayDirection = new Vector3f(worldSpaceFar.x - worldSpaceNear.x, worldSpaceFar.y - worldSpaceNear.y, worldSpaceFar.z - worldSpaceNear.z); rayDirection.normalise(); Ray clickRay = new Ray(rayPosition, rayDirection); Vector tMin = new Vector(), tMax = new Vector(), tempPoint; float largestEnteringValue, smallestExitingValue, temp, closestEnteringValue = Camera.get().getFar()+0.1f; Drawable closestDrawableHit = null; for(Drawable d : this.worldModel.getDrawableThings()) { // Calcualte AABB for each object... needs to be moved later... firstVertex = true; for(Surface surface : d.getSurfaces()) { for(Vertex v : surface.getVertices()) { worldPosition.x = (v.x+d.getPosition().x)*d.getScale().x; worldPosition.y = (v.y+d.getPosition().y)*d.getScale().y; worldPosition.z = (v.z+d.getPosition().z)*d.getScale().z; worldPosition = worldPosition.rotate(d.getRotation()); if (firstVertex) { maxX = worldPosition.x; maxY = worldPosition.y; maxZ = worldPosition.z; minX = worldPosition.x; minY = worldPosition.y; minZ = worldPosition.z; firstVertex = false; } else { if (worldPosition.x > maxX) { maxX = worldPosition.x; } if (worldPosition.x < minX) { minX = worldPosition.x; } if (worldPosition.y > maxY) { maxY = worldPosition.y; } if (worldPosition.y < minY) { minY = worldPosition.y; } if (worldPosition.z > maxZ) { maxZ = worldPosition.z; } if (worldPosition.z < minZ) { minZ = worldPosition.z; } } } } // ray/slabs intersection test... 
// clickRay.getOrigin().x + clickRay.getDirection().x * f = minX // clickRay.getOrigin().x - minX = -clickRay.getDirection().x * f // clickRay.getOrigin().x/-clickRay.getDirection().x - minX/-clickRay.getDirection().x = f // -clickRay.getOrigin().x/clickRay.getDirection().x + minX/clickRay.getDirection().x = f largestEnteringValue = -clickRay.getOrigin().x/clickRay.getDirection().x + minX/clickRay.getDirection().x; temp = -clickRay.getOrigin().y/clickRay.getDirection().y + minY/clickRay.getDirection().y; if(largestEnteringValue < temp) { largestEnteringValue = temp; } temp = -clickRay.getOrigin().z/clickRay.getDirection().z + minZ/clickRay.getDirection().z; if(largestEnteringValue < temp) { largestEnteringValue = temp; } smallestExitingValue = -clickRay.getOrigin().x/clickRay.getDirection().x + maxX/clickRay.getDirection().x; temp = -clickRay.getOrigin().y/clickRay.getDirection().y + maxY/clickRay.getDirection().y; if(smallestExitingValue > temp) { smallestExitingValue = temp; } temp = -clickRay.getOrigin().z/clickRay.getDirection().z + maxZ/clickRay.getDirection().z; if(smallestExitingValue < temp) { smallestExitingValue = temp; } if(largestEnteringValue > smallestExitingValue) { //System.out.println("Miss!"); } else { if (largestEnteringValue < closestEnteringValue) { closestEnteringValue = largestEnteringValue; closestDrawableHit = d; } } } if(closestDrawableHit != null) { System.out.println("Hit at: (" + clickRay.setDistance(closestEnteringValue).x + ", " + clickRay.getCurrentPosition().y + ", " + clickRay.getCurrentPosition().z); this.worldModel.removeDrawableThing(closestDrawableHit); } } } I just don't understand what's wrong, the ray are shooting and i do hit stuff that gets removed but the result of the ray are verry strange it sometimes removes the thing im clicking at, sometimes it removes things thats not even close to what im clicking at, and sometimes it removes nothing at all. Edit: Okay so i have continued searching for errors and by debugging the ray (by painting smal dots where it travles) i can now se that there is something oviously wrong with the ray that im sending out... it has its origin near the world center (nearer or further away depending on where on the screen im clicking) and always shots to the same position no matter where I direct my camera... My initial toughts is that there might be some error in the way i calculate my viewMatrix (since it's not possible to get the viewmatrix from the gluLookAt method in lwjgl; I have to build it my self and I guess thats where the problem is at)... 
Edit2: This is how i calculate it currently: private double[][] viewMatrixDouble = {{0,0,0,0}, {0,0,0,0}, {0,0,0,0}, {0,0,0,1}}; public Vector getCameraDirectionVector() { Vector actualEye = this.getActualEyePosition(); return new Vector(lookAt.x-actualEye.x, lookAt.y-actualEye.y, lookAt.z-actualEye.z); } public Vector getActualEyePosition() { return eye.rotate(this.getRotation()); } public void generateViewMatrix() { Vector cameraDirectionVector = getCameraDirectionVector().normalize(); Vector side = Vector.cross(cameraDirectionVector, this.upVector).normalize(); Vector up = Vector.cross(side, cameraDirectionVector); viewMatrixDouble[0][0] = side.x; viewMatrixDouble[0][1] = up.x; viewMatrixDouble[0][2] = -cameraDirectionVector.x; viewMatrixDouble[1][0] = side.y; viewMatrixDouble[1][1] = up.y; viewMatrixDouble[1][2] = -cameraDirectionVector.y; viewMatrixDouble[2][0] = side.z; viewMatrixDouble[2][1] = up.z; viewMatrixDouble[2][2] = -cameraDirectionVector.z; /* Vector actualEyePosition = this.getActualEyePosition(); Vector zaxis = new Vector(this.lookAt.x - actualEyePosition.x, this.lookAt.y - actualEyePosition.y, this.lookAt.z - actualEyePosition.z).normalize(); Vector xaxis = Vector.cross(upVector, zaxis).normalize(); Vector yaxis = Vector.cross(zaxis, xaxis); viewMatrixDouble[0][0] = xaxis.x; viewMatrixDouble[0][1] = yaxis.x; viewMatrixDouble[0][2] = zaxis.x; viewMatrixDouble[1][0] = xaxis.y; viewMatrixDouble[1][1] = yaxis.y; viewMatrixDouble[1][2] = zaxis.y; viewMatrixDouble[2][0] = xaxis.z; viewMatrixDouble[2][1] = yaxis.z; viewMatrixDouble[2][2] = zaxis.z; viewMatrixDouble[3][0] = -Vector.dot(xaxis, actualEyePosition); viewMatrixDouble[3][1] =-Vector.dot(yaxis, actualEyePosition); viewMatrixDouble[3][2] = -Vector.dot(zaxis, actualEyePosition); */ viewMatrix = new Matrix4f(); viewMatrix.load(getViewMatrixAsFloatBuffer()); } Would be verry greatfull if anyone could verify if this is wrong or right, and if it's wrong; supply me with the right way of doing it... I have read alot of threads and documentations about this but i can't seam to wrapp my head around it... Edit3: Okay with the help of Byte56 (thanks alot for the help) i have now concluded that it's not the viewMatrix that is the problem... I still get the same messedup result; anyone that think that they can find the error in my code, i certenly can't, have bean working on this for 3 days now :(
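For reference, a common way to write the ray/AABB slab test sketched in the comments of the question code is to compute the near and far distances per axis and swap them whenever the ray direction component is negative, then intersect the resulting intervals. The sketch below is a generic Java version with hypothetical array-based parameters rather than the poster's Vector/Ray classes; it is offered as a reference shape for the test, not as a diagnosis of the poster's bug.

// Generic ray vs. axis-aligned-box slab test (hypothetical helper, not the
// poster's code). origin/dir/boxMin/boxMax are {x, y, z} arrays. Returns the
// entering distance t along the ray, or Float.NaN when the ray misses the box.
public final class RayAabb {
    public static float intersectAabb(float[] origin, float[] dir, float[] boxMin, float[] boxMax) {
        float largestEntering = Float.NEGATIVE_INFINITY;
        float smallestExiting = Float.POSITIVE_INFINITY;
        for (int axis = 0; axis < 3; axis++) {
            float tNear = (boxMin[axis] - origin[axis]) / dir[axis];
            float tFar  = (boxMax[axis] - origin[axis]) / dir[axis];
            if (tNear > tFar) {                  // direction is negative on this axis: swap
                float tmp = tNear; tNear = tFar; tFar = tmp;
            }
            largestEntering = Math.max(largestEntering, tNear);
            smallestExiting = Math.min(smallestExiting, tFar);
        }
        boolean hit = largestEntering <= smallestExiting && smallestExiting >= 0;
        return hit ? largestEntering : Float.NaN;
    }

    public static void main(String[] args) {
        float[] origin = {0f, 0f, 5f};
        float[] dir = {0f, 0f, -1f};             // looking down -Z, as in camera space
        float t = intersectAabb(origin, dir, new float[]{-1f, -1f, -1f}, new float[]{1f, 1f, 1f});
        System.out.println("entering t = " + t); // expected 4.0
    }
}

One thing worth checking against a slab test written without the swap is whether the entering and exiting values are still ordered correctly when the click comes from the "far side" of a box, i.e. when a direction component is negative.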

    Read the article

  • CodePlex Daily Summary for Tuesday, January 18, 2011

    CodePlex Daily Summary for Tuesday, January 18, 2011Popular ReleasesThe Open Source Phasor Data Concentrator: January 2011 openPDC v1.4 Release: Planned version of the January 2011, version 1.4 release of the openPDC. This is a functional BETA version of the January 2011 openPDC. The final release of this version will include integrated system user authentication in the openPDC Manager along with detailed configuration change logging. Update notes: Real-time data access / subscription based API available supporting full resolution as well as down-sampled data Improved UDP_T support (control channel failure monitoring independent...mytrip.mvc (CMS & e-Commerce): mytrip.mvc 1.0.52.1 beta 2: New MVC3 RTM fix bug: Dropdown select fix bug: Add Store/Department and Add Store/Produser WEB.mytrip.mvc 1.0.52.1 Web for install hosting System Requirements: NET 4.0, MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) SRC.mytrip.mvc 1.0.52.1 System Requirements: Visual Studio 2010 or Web Deweloper 2010 MSSQL 2008 or MySql (auto creation table to database) if .\SQLEXPRESS auto creation database (App_Data folder) Connector/Net...Windows 7 Werkbank: PixelShader in WPF: Dieses Beispiel demonstriert wie man Pixelshader in bestehende WPF-Anwendungen integrieren kann, um mit grafische "Spielereien" die Oberfläche aufzuwerten.QRCode Helper: ver.1.0.0: This is first release.ASP.NET MVC Project Awesome, jQuery Ajax helpers (controls): 1.6.1: A rich set of helpers (controls) that you can use to build highly responsive and interactive Ajax-enabled Web applications. These helpers include Autocomplete, AjaxDropdown, Lookup, Confirm Dialog, Popup Form, Popup and Pager changes: RenderView controller extension works for razor also live demo switched to razorBloodSim: BloodSim - 1.3.3.1: - Priority update to resolve a bug that was causing Boss damage to ignore Blood Shields entirelyRawr: Rawr 4.0.16 Beta: Rawr is now web-based. The link to use Rawr4 is: http://elitistjerks.com/rawr.phpThis is the Cataclysm Beta Release. More details can be found at the following link http://rawr.codeplex.com/Thread/View.aspx?ThreadId=237262 As of this release, you can now also begin using the new Downloadable WPF version of Rawr!This is a pre-alpha release of the WPF version, there are likely to be a lot of issues. If you have a problem, please follow the Posting Guidelines and put it into the Issue Tracker. W...MvcContrib: an Outer Curve Foundation project: MVC 3 - 3.0.51.0: Please see the Change Log for a complete list of changes. MVC BootCamp Description of the releases: MvcContrib.Release.zip MvcContrib.dll MvcContrib.TestHelper.dll MvcContrib.Extras.Release.zip T4MVC. The extra view engines / controller factories and other functionality which is in the project. This file includes the main MvcContrib assembly. Samples are included in the release. You do not need MvcContrib if you download the Extras.N2 CMS: 2.1.1: N2 is a lightweight CMS framework for ASP.NET. It helps you build great web sites that anyone can update. 2.1.1 Maintenance release List of changes 2.1 Major Changes Support for auto-implemented properties ({get;set;}, based on contribution by And Poulsen) File manager improvements (multiple file upload, resize images to fit) New image gallery Infinite scroll paging on news Content templates First time with N2? 
Try the demo site Download one of the template packs (above) and open...VidCoder: 0.8.1: Adds ability to choose an arbitrary range (in seconds or frames) to encode. Adds ability to override the title number in the output file name when enqueing multiple titles. Updated presets: Added iPhone 4, Apple TV 2, fixed some existing presets that should have had weightp=0 or trellis=0 on them. Added {parent} option to auto-name format. Use {parent:2} to refer to a folder 2 levels above the input file. Added {title:2} option to auto-name format. Adds leading zeroes to reach the sp...Microsoft SQL Server Product Samples: Database: AdventureWorks2008R2 without filestream: This download contains a version of the AdventureWorks2008R2 OLTP database without FILESTREAM properties. You do not need to have filestream enabled to attach this database. No additional schema or data changes have been made. To install the version of AdventureWorks2008R2 that includes filestream, use the SR1 installer available here. Prerequisites: Microsoft SQL Server 2008 R2 must be installed. Full-Text Search must be enabled. Installing the AdventureWorks2008R2 OLTP database: 1. Cl...NuGet: NuGet 1.0 RTM: NuGet is a free, open source developer focused package management system for the .NET platform intent on simplifying the process of incorporating third party libraries into a .NET application during development. This release is a Visual Studio 2010 extension and contains the the Package Manager Console and the Add Package Dialog.MVC Music Store: MVC Music Store v2.0: This is the 2.0 release of the MVC Music Store Tutorial. This tutorial is updated for ASP.NET MVC 3 and Entity Framework Code-First, and contains fixes and improvements based on feedback and common questions from previous releases. The main download, MvcMusicStore-v2.0.zip, contains everything you need to build the sample application, including A detailed tutorial document in PDF format Assets you will need to build the project, including images, a stylesheet, and a pre-populated databas...Free Silverlight & WPF Chart Control - Visifire: Visifire SL and WPF Charts v3.6.7 GA Released: Hi, Today we are releasing Visifire 3.6.7 GA with the following feature: * Inlines property has been implemented in Title. Also, this release contains fix for the following bugs: * In Column and Bar chart DataPoint’s label properties were not working as expected at real-time if marker enabled was set to true. * 3D Column and Bar chart were not rendered properly if AxisMinimum property was set in x-axis. You can download Visifire v3.6.7 here. Cheers, Team Visifire??????????: All-In-One Code Framework ??? 2011-01-12: 2011???????All-In-One Code Framework(??) 2011?1??????!!http://i3.codeplex.com/Project/Download/FileDownload.aspx?ProjectName=1code&DownloadId=128165 ?????release?,???????ASP.NET, AJAX, WinForm, Windows Shell????13?Sample Code。???,??????????sample code。 ?????:http://blog.csdn.net/sjb5201/archive/2011/01/13/6135037.aspx ??,??????MSDN????????????。 http://social.msdn.microsoft.com/Forums/zh-CN/codezhchs/threads ?????????????????,??Email ????patterns & practices – Enterprise Library: Enterprise Library 5.0 - Extensibility Labs: This is a preview release of the Hands-on Labs to help you learn and practice different ways the Enterprise Library can be extended. 
Learning MapCustom exception handler (estimated time to complete: 1 hr 15 mins) Custom logging trace listener (1 hr) Custom configuration source (registry-based) (30 mins) System requirementsEnterprise Library 5.0 / Unity 2.0 installed SQL Express 2008 installed Visual Studio 2010 Pro (or better) installed AuthorsChris Tavares, Microsoft Corporation ...Orchard Project: Orchard 1.0: Orchard Release Notes Build: 1.0.20 Published: 1/12/2010 How to Install OrchardTo install Orchard using Web PI, follow these instructions: http://www.orchardproject.net/docs/Installing-Orchard.ashx Web PI will detect your hardware environment and install the application. --OR-- Alternatively, to install the release manually, download the Orchard.Web.1.0.20.zip file. http://orchardproject.net/docs/Manually-installing-Orchard-zip-file.ashx The zip contents are pre-built and ready-to-run...Umbraco CMS: Umbraco 4.6.1: The Umbraco 4.6.1 (codename JUNO) release contains many new features focusing on an improved installation experience, a number of robust developer features, and contains nearly 200 bug fixes since the 4.5.2 release. Getting Started A great place to start is with our Getting Started Guide: Getting Started Guide: http://umbraco.codeplex.com/Project/Download/FileDownload.aspx?DownloadId=197051 Make sure to check the free foundation videos on how to get started building Umbraco sites. They're ...StyleCop for ReSharper: StyleCop for ReSharper 5.1.14986.000: A considerable amount of work has gone into this release: Features: Huge focus on performance around the violation scanning subsystem: - caching added to reduce IO operations around reading and merging of settings files - caching added to reduce creation of expensive objects Users should notice condsiderable perf boost and a decrease in memory usage. Bug Fixes: - StyleCop's new ObjectBasedEnvironment object does not resolve the StyleCop installation path, thus it does not return the ...Facebook C# SDK: 4.2.1: - Authentication bug fixes - Updated Json.Net to version 4.0.0 - BREAKING CHANGE: Removed cookieSupport config setting, now automatic. This download is also availible on NuGet: Facebook FacebookWeb FacebookWebMvcNew ProjectsAmazon Clone MVC: Amazon CloneBogglex: Bogglex is a single player Boggle game written using C# and WPF.ClomibepASP: ClomibepASPCommandLineHelp: CommandLineHelp is a framework for simplifying the automated execution of command-line programs and saving their output.DistriLog: This is set of libraries that allow you to handle distributed logging. This is aimed at applications that are installed on multiple machines and instead of having a central log server(that may slow down the application due to network latency), a local log is created. A synchronization process then unifies these logs into a central SQL database. Local database is SQL Server 2005 Compact Edition, the library is in VB and the central database is SQL Server 2005enmeshed: A set of technology trials for efficient network streaming/transfer.Hexing Colors for Windows Phone 7: Hexing Colors is a simple Silverlight game for Windows Phone 7 based on the web game "What the Hex" written by Andrew Yang and created for educational purposes. The code of the app is here published for anyone to download and analyze it to learn the basic internals of a WP7 app.NetChannels: NetChannels is a library to provide an asynchronous event-driven network application framework for the rapid development of maintainable high-performance high-scalability protocol servers. 
It is based on the architecture of the netty project for C#.NMEA Sentence Parser: The NMEA Parser is a lightweight library used to parse NMEA sentences into geocodes which can be used in geoservice applications. The project is written in C#, using Visual Studio 2010.NUnit Windows Phone 7: Project to run NUnit Tests on Windows Phone 7 with a list of results shown and drill down detail view.Pratiques: Endroit pour gérer les Pratiques.Project-Cena2: Project-Cena2ReportEngine: The is report platform, it' can be extend to export reportSharepoint DeepZoom Search: This project demonstrates using A Silverlight DeepZoom app to query the SharePoint search api and show those results as deep zoom tiles. This project is based upon or uses components from the Eventr and SuperDeepZoom projects.SilverDesktop: SDSixport: Sixport is the C# port of the hexter DSSI software synthesizer plugin created by Sean Bolton and others. hexter is an open source emulation of the legendary Yamaha DX7 synthesizer. Changes done: OOP structure, algorithm specific rendering, LADSPA removal, speed improvements.Smug: Is your time writing code too valuable to spend writing tests? Are you too good for test code; too smug? Smug is a Studs and Mocks Uber Generator; a factory for creating proxy objects to simplify testing.sptest: one of the projectSQL Azure Demos: Home for Microsoft SQL Azure screencasts and demo applications.StaffPenalties: Staff Penalties... simple silverlight appTFS Global Alerts: A web service to notify any number of users when any work item in TFS changes. Notification logic is easily customisable to suit your environment. XNA SfxrSynth: Using settings from as3sfxr, SfxrSynth generates audio in the form of XNA SoundEffects for using in Windows or Xbox 360 games.???: ???????。

    Read the article

  • The Changing Face of PASS

    - by Bill Graziano
    I’m starting my sixth year on the PASS Board.  I served two years as the Program Director, two years as the Vice-President of Marketing and I’m starting my second year as the Executive Vice-President of Finance.  There’s a pretty good chance that if PASS has done something you don’t like or is doing something you don’t like, that I’m involved in one way or another. Andy Leonard asked in a comment on his blog if the Board had ever reversed itself based on community input.  He asserted that it hadn’t.  I disagree.  I’m not going to try and list all the changes we make inside portfolios based on feedback from and meetings with the community.  I’m going to focus on major governance issues since I was elected to the Board. Management Company The first big change was our management company.  Our old management company had a standard approach to running a non-profit.  It worked well when PASS was launched.  Having a ready-made structure and process to run the organization enabled the organization to grow quickly.  As time went on we were limited in some of the things we wanted to do.  The more involved you were with PASS, the more you saw these limitations.  Key volunteers were regularly providing feedback that they wanted certain changes that were difficult for us to accomplish.  The Board at that time wanted changes that were difficult or impossible to accomplish under that structure. This was not a simple change.  Imagine a $2.5 million dollar company letting all its employees go on a Friday and starting with a new staff on Monday.  We also had a very narrow window to accomplish that so that we wouldn’t affect the Summit – our only source of revenue.  We spent the year after the change rebuilding processes and putting on the Summit in Denver.  That’s a concrete example of a huge change that PASS made to better serve its members.  And it was a change that many in the community were telling us we needed to make. Financials We heard regularly from our members that they wanted our financials posted.  Today on our web site you can find audited financials going back to 2004.  We publish our budget at the start of each year.  If you ask a question about the financials on the PASS site I do my best to answer it.  I’m also trying to do a better job answering financial questions posted in other locations.  (And yes, I know I owe a few of you some blog posts.) That’s another concrete example of a change that our members asked for that the Board agreed was a good decision. Minutes When I started on the Board the meeting minutes were very limited.  The minutes from a two day Board meeting might fit on one page.  I think we did the bare minimum we were legally required to do.  Today Board meeting minutes run from 5 to 12 pages and go into incredible detail on what we talk about.  There are certain topics that are NDA but where possible we try to list the topic we discussed but that the actual discussion was under NDA.  We also publish the agenda of Board meetings ahead of time. This is another specific example where input from the community influenced the decision.  It was certainly easier to have limited minutes but I think the extra effort helps our members understand what’s going on. Board Q&A At the 2009 Summit the Board held its first public Q&A with our members.  We’d always been available individually to answer questions.  There’s a benefit to getting us all in one room and asking the really hard questions to watch us squirm.  We learn what questions we don’t have good answers for.  
We get to see how many people in the crowd look interested in the various questions and answers. I don’t recall the genesis of how this came about.  I’m fairly certain there was some community pressure though. Board Votes Until last November, the Board only reported the vote totals and not how individual Board members voted.  That was one of the topics at a great lunch I had with Tim Mitchell and Kendal van Dyke at the Summit.  That was also the topic of the first question asked at the Board Q&A by Kendal.  Kendal expressed his opposition to anonymous votes clearly and passionately and without trying to paint anyone into a corner.  Less than 24 hours later the PASS Board voted to make individual votes public unless the topic was under NDA.  That’s another area where the Board decided to change based on feedback from our members. Summit Location While this isn’t actually a governance issue it is one of the more public decisions we make that has taken some public criticism.  There is a significant portion of our members that want the Summit near them.  There is a significant portion of our members that like the Summit in Seattle.  There is a significant portion of our members that think it should move around the country.  I was one that felt strongly that there were significant, tangible benefits to our attendees of being in Seattle every year.  I’m also one that has been swayed by some very compelling arguments that we need to have at least one outside Seattle and then revisit the decision.  I can’t tell you how the Board will vote but I know the opinion of our members weighs heavily on the decision. Elections And that brings us to the grand-daddy of all governance issues.  My thesis for this blog post is that the PASS Board has implemented policy changes in response to member feedback.  It isn’t to defend or criticize our election process.  It’s just to say that it has been undergoing continuous change since I’ve been on the Board.  I ran for the Board in the fall of 2005.  I don’t know much about what happened before then.  I was actively volunteering for PASS for four years prior to that as a chapter leader and on the program committee.  I don’t recall any complaints about elections but that doesn’t mean they didn’t occur.  The questions from the Nominating Committee (NomCom) were trivial and the selection process rudimentary (for example, “Tell us about your accomplishments”).  I don’t even remember who I ran against or how many other people ran.  I ran for the VP of Marketing in the fall of 2007.  I don’t recall any significant changes the Board made in the election process for that election.  I think a lot of the changes in 2007 came from us asking the management company to work on the election process.  I was expecting a similar set of puff ball questions from my previous election.  Boy, was I in for a shock.  The NomCom had found a much better set of questions and really made the interview portion difficult.  The questions were much more behavioral in nature.  I’d already written about my vision for PASS and my goals.  They wanted to know how I handled adversity, how I handled criticism, how I handled conflict, how I handled troublesome volunteers, how I motivated people and how I responded to motivation. And many, many other things. They grilled me for over an hour.  I’ve done a fair bit of technical sales in my time.  I feel I speak well under pressure addressing pointed questions.  This interview intentionally put me under pressure.  
In addition to wanting to know about my interpersonal skills, my work experience, my volunteer experience and my supervisory experience they wanted to see how I’d do under pressure.  They wanted to see who would respond under pressure and who wouldn’t.  It was a bit of a shock. That was the first big change I remember in the election process.  I know there were other improvements around the process but none of them stick in my mind quite like the unexpected hour-long grilling. The next big change I remember was after the 2009 elections.  Andy Warren was unhappy with the election process and wanted to make some changes.  He worked with Hannes at HQ and they came up with a better set of processes.  I think Andy moved PASS in the right direction.  Nonetheless, after the 2010 election even more people were very publicly clamoring for changes to our election process.  In August of 2010 we had a choice to make.  There were numerous bloggers criticizing the Board and our upcoming election.  The easy change would be to announce that we were changing the process in a way that would satisfy our critics.  I believe that a knee-jerk response to criticism is seldom correct. Instead the Board spent August and September and October and November listening to the community.  I visited two SQLSaturdays and asked questions of everyone I could.  I attended chapter meetings and asked questions of as many people as they’d let me.  At Summit I made it a point to introduce myself to strangers and ask them about the election.  At every breakfast I’d sit down at a table full of strangers and ask about the election.  I’m happy to say that I left most tables arguing about the election.  Most days I managed to get 2 or 3 breakfasts in. I spent less time talking to people that had already written about the election.  They were already expressing their opinion.  I wanted to talk to people that hadn’t spoken up.  I wanted to know what the silent majority thought.  The Board all attended the Q&A session where our members expressed their concerns about a variety of issues including the election. The PASS Board also chose to create the Election Review Committee.  We wanted people from the community that had been involved with PASS to look at our election process with fresh eyes while listening to what the community had to say and give us some advice on how we could improve the process.  I’m a part of this as is Andy Warren.  None of the other members are on the Board.  I’ve sat in numerous calls and interviews with this group and attended an open meeting at the Summit.  We asked anyone that wanted to discuss the election to come speak with us.  The ERC held an open meeting at the Summit and invited anyone to attend.  There are forums on the ERC web site where we’ve invited people to participate.  The ERC has reached to key people involved in recent elections.  The years that I haven’t mentioned also saw minor improvements in the election process.  Off the top of my head I don’t recall what exact changes were made each year.  Specifically since the 2010 election we’ve gone out of our way to seek input from the community about the process.  I’m not sure what more we could have done to invite feedback from the community. I think to say that we haven’t “fixed” the election process isn’t a fair criticism at this time.  We haven’t rushed any changes through the process.  
If you don’t see any changes in our election process in July or August then I think it’s fair to criticize us for ignoring the community or ask for an explanation for what we’ve done. In Summary Andy’s main point was that the PASS Board hasn’t changed in response to our members wishes.  I think I’ve shown that time and time again the PASS Board has changed in response to what our members want.  There are only two outstanding issues: Summit location and elections.  The 2013 Summit location hasn’t been decided yet.  Our work on the elections is also in progress.  And at every step in the election review we’ve gone out of our way to listen to the community and incorporate their feedback on the process. I also hope I’m not encouraging everyone that wants some change in the organization to organize a “blog rush” against the Board.  We take public suggestions very seriously but we also take the time to evaluate those suggestions and learn what the rest of our members think and make a measured decision.

    Read the article

  • Oracle Tutor: Top 10 to Implement Sustainable Policies and Procedures

    - by emily.chorba(at)oracle.com
    Overview Your organization (executives, managers, and employees) understands the value of having written business process documents (process maps, procedures, instructions, reference documents, and form abstracts). Policies and procedures should be documented because they help to reduce the range of individual decisions and encourage management by exception: the manager only needs to give special attention to unusual problems, not covered by a specific policy or procedure. As more and more procedures are written to cover recurring situations, managers will begin to make decisions which will be consistent from one functional area to the next.Companies should take a project management approach when implementing an environment for a sustainable documentation program and do the following:1. Identify an Executive Champion2. Put together a winning team3. Assign ownership4. Centralize publishing5. Establish the Document Maintenance Process Up Front6. Document critical activities only7. Document actual practice8. Minimize documentation9. Support continuous improvement10. Keep it simple 1. Identify an Executive ChampionAppoint a top down driver. Select one key individual to be a mentor for the procedure planning team. The individual should be a senior manager, such as your company president, CIO, CFO, the vice-president of quality, manufacturing, or engineering. Written policies and procedures can be important supportive aids when known to express the thinking for the chief executive officer and / or the president and to have his or her full support. 2. Put Together a Winning TeamChoose a strong Project Management Leader and staff the procedure planning team with management members from cross functional groups. Make sure team members have the responsibility - and the authority - to make things happen.The winning team should consist of the Documentation Project Manager, Document Owners (one for each functional area), a Document Controller, and Document Specialists (as needed). The Tutor Implementation Guide has complete job descriptions for these roles. 3. Assign Ownership It is virtually impossible to keep process documentation simple and meaningful if employees who are far removed from the activity itself create it. It is impossible to keep documentation up-to-date when responsibility for the document is not clearly understood.Key to the Tutor methodology, therefore, is the concept of ownership. Each document has a single owner, who is responsible for ensuring that the document is necessary and that it reflects actual practice. The owner must be a person who is knowledgeable about the activity and who has the authority to build consensus among the persons who participate in the activity as well as the authority to define or change the way an activity is performed. The owner must be an advocate of the performers and negotiate, not dictate practices.In the Tutor environment, a document's owner is the only person with the authority to approve an update to that document. 4. Centralize Publishing Although it is tempting (especially in a networked environment and with document management software solutions) to decentralize the control of all documents -- with each owner updating and distributing his own -- Tutor promotes centralized publishing by assigning the Document Administrator (gate keeper) to manage the updates and distribution of the procedures library. 5. 
Establish a Document Maintenance Process Up Front (and stick to it) Everyone in your organization should know they are invited to suggest changes to procedures and should understand exactly what steps to take to do so. Tutor provides a set of procedures to help your company set up a healthy document control system. There are many document management products available to automate some of the document change and maintenance steps. Depending on the size of your organization, a simple document management system can reduce the effort it takes to track and distribute document changes and updates. Whether your company decides to store the written policies and procedures on a file server or in a database, the essential tasks for maintaining documents are the same, though some tasks are automated. 6. Document Critical Activities Only The best way to keep your documentation simple is to reduce the number of process documents to a bare minimum and to include in those documents only as much detail as is absolutely necessary. The first step to reducing process documentation is to document only those activities that are deemed critical. Not all activities require documentation. In fact, some critical activities cannot and should not be standardized. Others may be sufficiently documented with an instruction or a checklist and may not require a procedure. A document should only be created when it enhances the performance of the employee performing the activity. If it does not help the employee, then there is no reason to maintain the document. Activities that represent little risk (such as project status), activities that cannot be defined in terms of specific tasks (such as product research), and activities that can be performed in a variety of ways (such as advertising) often do not require documentation. Sometimes, an activity will evolve to the point where documentation is necessary. For example, an activity performed by single employee may be straightforward and uncomplicated -- that is, until the activity is performed by multiple employees. Sometimes, it is the interaction between co-workers that necessitates documentation; sometimes, it is the complexity or the diversity of the activity.7. Document Actual Practices The only reason to maintain process documentation is to enhance the performance of the employee performing the activity. And documentation can only enhance performance if it reflects reality -- that is, current best practice. Documentation that reflects an unattainable ideal or outdated practices will end up on the shelf, unused and forgotten.Documenting actual practice means (1) auditing the activity to understand how the work is really performed, (2) identifying best practices with employees who are involved in the activity, (3) building consensus so that everyone agrees on a common method, and (4) recording that consensus.8. Minimize Documentation One way to keep it simple is to document at the highest level possible. That is, include in your documents only as much detail as is absolutely necessary.When writing a document, you should ask yourself, What is the purpose of this document? That is, what problem will it solve?By focusing on this question, you can target the critical information.• What questions are the end users likely to have?• What level of detail is required?• Is any of this information extraneous to the document's purpose? Short, concise documents are user friendly and they are easier to keep up to date. 9. 
Support Continuous Improvement Employees who perform an activity are often in the best position to identify improvements to the process. In other words, continuous improvement is a natural byproduct of the work itself -- but only if the improvements are communicated to all employees who are involved in the process, and only if there is consensus among those employees. Traditionally, process documentation has been used to dictate performance, to limit employees' actions. In the Tutor environment, process documents are used to communicate improvements identified by employees. How does this work? The Tutor methodology requires a process document to reflect actual practice, so the owner of a document must routinely audit its content -- does the document match what the employees are doing? If it doesn't, the owner has the responsibility to evaluate the process, to build consensus among the employees, to identify "best practices," and to communicate these improvements via a document update. Continuous improvement can also be an outgrowth of corrective action -- but only if the solutions to problems are communicated effectively. The goal should be to solve a problem once and only once, which means not only identifying the solution, but ensuring that the solution becomes part of the process. The Tutor system provides the method through which improvements and solutions are documented and communicated to all affected employees in a cost-effective, timely manner; it ensures that improvements are not lost or confined to a single employee. 10. Keep it Simple Process documents don't have to be complex and unfriendly. In fact, the simpler the format and organization, the more likely the documents will be used. And the simpler the method of maintenance, the more likely the documents will be kept up-to-date. Keep it simple by: • Minimizing skills and training required • Following the established Tutor document format and layout • Avoiding technology just for technology's sake No other rule has as major an impact on the success of your internal documentation as -- keep it simple. Learn More For more information about Tutor, visit Oracle.Com or the Tutor Blog. Post your questions at the Tutor Forum.   Emily Chorba Principal Product Manager Oracle Tutor & BPM 

    Read the article

  • CodePlex Daily Summary for Wednesday, May 16, 2012

    CodePlex Daily Summary for Wednesday, May 16, 2012Popular ReleasesWatchersNET.UrlShorty: WatchersNET.UrlShorty 01.03.03: changes Fixed Url & Error History when urls contain line breaksMetadata Document Generator for Microsoft Dynamics CRM 2011: Metadata Document Generator (1.0.320.91): NEW FEATURES Use of GemBox Software Document assembly to generate Word document Thanks to them! http://www.gemboxsoftware.com/ Word document is now using headings for better navigation BUG FIXES Export fails if two entities have same display name Export fails if entity has more than one main form Clicking multiple time on "Check all" button duplicate attributes list Not working if attributes are Selected Unable to create connection on Partner VM Does not generate document in other than main ...AspxCommerce: AspxCommerce1.1: AspxCommerce - 'Flexible and easy eCommerce platform' offers a complete e-Commerce solution that allows you to build and run your fully functional online store in minutes. You can create your storefront; manage the products through categories and subcategories, accept payments through credit cards and ship the ordered products to the customers. We have everything set up for you, so that you can only focus on building your own online store. Note: To login as a superuser, the username and pass...SiteMap Editor for Microsoft Dynamics CRM 2011: SiteMap Editor (1.1.1616.403): BUG FIX Hide save button when Titles or Descriptions element is selectedVisual C++ 2010 Directories Editor: Visual C++ 2010 Directories Editor (x32_x64): release v1.3MapWindow 6 Desktop GIS: MapWindow 6.1.2: Looking for a .Net GIS Map Application?MapWindow 6 Desktop GIS is an open source desktop GIS for Microsoft Windows that is built upon the DotSpatial Library. This release requires .Net 4 (Client Profile). Are you a software developer?Instead of downloading MapWindow for development purposes, get started with with the DotSpatial template. The extensions you create from the template can be loaded in MapWindow.DotSpatial: DotSpatial 1.2: This is a Minor Release. See the changes in the issue tracker. Minimal -- includes DotSpatial core and essential extensions Extended -- includes debugging symbols and additional extensions Tutorials are available. Just want to run the software? End user (non-programmer) version available branded as MapWindow Want to add your own feature? Develop a plugin, using the template and contribute to the extension feed (you can also write extensions that you distribute in other ways). Components ...Mugen Injection: Mugen Injection 2.2.1 (WinRT supported): Added ManagedScopeLifecycle. Increase performance. Added support for resolve 'params'.Microsoft Ajax Minifier: Microsoft Ajax Minifier 4.52: Make preprocessor comment-statements nestable; add the ///#IFNDEF statement. (Discussion #355785) Don't throw an error for old-school JScript event handlers, and don't rename them if they aren't global functions.DotNetNuke® Events: 06.00.00: This is a serious release of Events. DNN 6 form pattern - We have take the full route towards DNN6: most notably the incorporation of the DNN6 form pattern with streamlined UX/UI. We have also tried to change all formatting to a div based structure. A daunting task, since the Events module contains a lot of forms. 
Roger has done a splendid job by going through all the forms in great detail, replacing all table style layouts into the new DNN6 div class="dnnForm XXX" type of layout with chang...LogicCircuit: LogicCircuit 2.12.5.15: Logic Circuit - is educational software for designing and simulating logic circuits. Intuitive graphical user interface, allows you to create unrestricted circuit hierarchy with multi bit buses, debug circuits behavior with oscilloscope, and navigate running circuits hierarchy. Changes of this versionThis release is fixing one but nasty bug. Two functions XOR and XNOR when used with 3 or more inputs were incorrectly evaluating their results. If you have a circuit that is using these functions...GAC Explorer: GACExplorer_x86_Setup: Version 1.0 Features -> Copy assembly(s) to clipboard. -> Copy assembly(s) to local folder. -> Open assembly(s) folder location. -> Support Shortcut keysBlogEngine.NET: BlogEngine.NET 2.6: Get DotNetBlogEngine for 3 Months Free! Click Here for More Info BlogEngine.NET Hosting - 3 months free! Cheap ASP.NET Hosting - $4.95/Month - Click Here!! Click Here for More Info Cheap ASP.NET Hosting - $4.95/Month - Click Here! If you want to set up and start using BlogEngine.NET right away, you should download the Web project. If you want to extend or modify BlogEngine.NET, you should download the source code. If you are upgrading from a previous version of BlogEngine.NET, please take...BlackJumboDog: Ver5.6.2: 2012.05.07 Ver5.6.2 (1) Web???????、????????·????????? (2) Web???????、?????????? COMSPEC PATHEXT WINDIR SERVERADDR SERVERPORT DOCUMENTROOT SERVERADMIN REMOTE_PORT HTTPACCEPTCHRSET HTTPACCEPTLANGUAGE HTTPACCEPTEXCODINGGardens Point Parser Generator: Gardens Point Parser Generator version 1.5.0: ChangesVersion 1.5.0 contains a number of changes. Error messages are now MSBuild and VS-friendly. The default encoding of the *.y file is Unicode, with an automatic fallback to the previous raw-byte interpretation. The /report option has been improved, as has the automaton tracing facility. New facilities are included that allow multiple parsers to share a common token type. A complete change-log is available as a separate documentation file. The source project has been upgraded to Visual...Media Companion: Media Companion 3.502b: It has been a slow week, but this release addresses a couple of recent bugs: Movies Multi-part Movies - Existing .nfo files that differed in name from the first part, were missed and scraped again. Trailers - MC attempted to scrape info for existing trailers. TV Shows Show Scraping - shows available only in the non-default language would not show up in the main browser. The correct language can now be selected using the TV Show Selector for a single show. General Will no longer prompt for ...NewLife XCode ??????: XCode v8.5.2012.0508、XCoder v4.7.2012.0320: X????: 1,????For .Net 4.0?? XCoder????: 1,???????,????X????,?????? XCode????: 1,Insert/Update/Delete???????????????,???SQL???? 2,IEntityOperate?????? 3,????????IEntityTree 4,????????????????? 5,?????????? 6,??????????????Google Book Downloader: Google Books Downloader Lite 1.0: Google Books Downloader Lite 1.0Python Tools for Visual Studio: 1.5 Alpha: We’re pleased to announce the release of Python Tools for Visual Studio 1.5 Alpha. Python Tools for Visual Studio (PTVS) is an open-source plug-in for Visual Studio which supports programming with the Python language. 
PTVS supports a broad range of features including: • Supports Cpython, IronPython, Jython and Pypy • Python editor with advanced member, signature intellisense and refactoring • Code navigation: “Find all refs”, goto definition, and object browser • Local and remote debugging...AD Gallery: AD Gallery 1.2.7: NewsFixed a bug which caused the current thumbnail not to be highlighted Added a hook to take complete control over how descriptions are handled, take a look under Documentation for more info Added removeAllImages()New ProjectsAccountingTest: just to learn asp.net mvc 3 Actucal: With Actucal you can make a calculation model. The model is compiled so you can add it to your projects. It has been developed in C# Also look at mine website (http://www.tronsoft.nl) for more information.AppFabric Caching Administration Services: This project is based on the idea of the current project : AppFabric Caching Admin Tool. This project is under development, a further description will be provide later.ASP.NET Web Chat (Prototype): A prototype web chat client application.bcipad: ipad appCM.Core.Library.dll: CM.Core.Library est une bibliothèque programmé en C# qui contient de nombreuses fonctions basique tel que de la cryptographie, manipulation de chaine de caractères, gestion de fichiers...Coding4Fun Boxing Bots: coding4fun boxing robotselblogdeDynamicsCRM.com!: Bienvenidos a nuestro sitio en CodePlex! Aquí compartiremos código de la comunidad en español de Dynamics CRM. www.elblogdedynamicscrm.com Twitter: http://twitter.com/elblogdeDynCRM Linkedin: http://www.linkedin.com/groups?gid=4258078ExercicioAula2: Projeto de exercício da aula de dotnet.FileListBuilder: Fabrique une liste de nom de fichier en parcourant une arborescence d'un disque. La liste peut être produite en format XML ou XLS. Dans cette liste chaque fichier est documenté, et un lien hypertext est associé. Francis Nino Seisei: This application is a template based code snippet generator for data access applications. It allows full control of the software project because it doesn't generate entire projects, only snippets that save time spent in repetitive tasks. For the next version it will be available in english. Esta aplicación es un generador de fragmentos de código basado en plantillas para aplicaciones de acceso a datos. Fue desarrollado en 15 días por lo cual puede tener uno que otro problema de estabilidad...FsAdo: FsAdo, sounds like falsetto :-), is an F# ADO.NET implementation of database factory and as a simple foundation for data access layer.GroupShop: group shop01321231HU01 Decompression: Hotmail's DeltaSync protocol uses custom compression scheme known as hm-compression or HU01. It is very similar to DEFLATE encoding, which is standardized as RFC 1951, but the bit stream format is different. Earlier work on decompressing these streams was published by Daniel Parnell, but it was merely a disassembled code rewritten to C and it didn't offer insight into the actual format of the data. This code is a clean room implementation of a decoder for HU01 streams.jQuery Media Player: A set of scripts used to play media from the webModel Generator Helper: Model Generator Helper is a small program that will allow you to easily create view models from your logic models by copying the logic model's properties and the properties' attributes. 
The original purpose of the program was to copy the validation annotations to my view models.muathenhanh: hello tat ca moi nguoi nheMyCommerce: Proyecto aun no terminadonczcomlibary: control library for silverlight 5 OHG.NET: OHG.NET ASP.NET Web Pages (Razor) Based Content Management System Rich Theme Support One-Click JQuery Plugins (Slider, LightBox,Paging...) Web Site Administrator Panel with Twitter Bootstrap Interface Light-weight and high perfomance with ASP.NET Web Pages Razor PgcDemuxCLI: A modified version of PgcDemux (http://download.videohelp.com/jsoto/dvdtools.htm) that has better CLI support and progress reporting.PowerShell Classroom Setup Tools for MS Learning Hyper-V machines: This scripts are intended for MCT! If you set up a MS Learning Classroom with Hyper-V and the images provided by MS at the MCT download page you need to deploy the VHD files (and the other VM files of course) to “C:\Programme\Microsoft Learning\<…>” The 4 scripts in this package will help you to set up your classroom environment at another file system location. Please read my blog article for details: ProjetoTeste: Projeto de testeReisebüro Office: dfgsfgSandbox OS: An all new COSMOS Based OS Built by the Marblestech Team.SSOrbit: Single Sign-On module that store user into a stack.Steve Blog: ???????????,?????SuchMich: Such Mich! is a game based on the famous game "Memory" (also known as "Find the pairs")Tanszeki terheleseloszto rendszer: A Tanszeki terheleseloszto rendszer egy olyan rendszer, amelynek segítségével egy tanszék feladatait lehet az oktatói között elosztani, úgy, hogy közben figyelembe vesszük az oktatók szakmai és személyes preferenciáit, és az órarendi adottságokat.TestApps: an epic testTRAVLR: Travlr Web App for collaborative trip management.WatTMDb Library: .NET Library for use with the new Version 3 API available from The Movie Db.WPF Fundamentals: WPF Fundamentals is a toolset with basic classes to use in any common WPF project. It includes: - support of the Model - ViewModel - Presenter pattern - extension for the Command Interface - common converter types (e.g. a BooleanConverter) - handy user controls (e.g. a PopupButton)

    Read the article

  • Benchmarking MySQL Replication with Multi-Threaded Slaves

    - by Mat Keep
The objective of this benchmark is to measure the performance improvement achieved when enabling the Multi-Threaded Slave enhancement delivered as part of MySQL 5.6. As the results demonstrate, Multi-Threaded Slaves deliver 5x higher replication performance based on a configuration with 10 databases/schemas. For real-world deployments, higher replication performance directly translates to: · Improved consistency of reads from slaves (i.e. reduced risk of reading "stale" data) · Reduced risk of data loss should the master fail before replicating all events in its binary log (binlog) The multi-threaded slave splits processing between worker threads based on schema, allowing updates to be applied in parallel, rather than sequentially. This delivers benefits to those workloads that isolate application data using databases - e.g. multi-tenant systems deployed in cloud environments. Multi-Threaded Slaves are just one of many enhancements to replication previewed as part of the MySQL 5.6 Development Release, which include: · Global Transaction Identifiers coupled with MySQL utilities for automatic failover / switchover and slave promotion · Crash Safe Slaves and Binlog · Optimized Row Based Replication · Replication Event Checksums · Time Delayed Replication These and many more are discussed in the “MySQL 5.6 Replication: Enabling the Next Generation of Web & Cloud Services” Developer Zone article  Back to the benchmark - details are as follows. Environment The test environment consisted of two Linux servers: · one running the replication master · one running the replication slave. Only the slave was involved in the actual measurements, and was based on the following configuration: - Hardware: Oracle Sun Fire X4170 M2 Server - CPU: 2 sockets, 6 cores with hyper-threading, 2930 MHz. - OS: 64-bit Oracle Enterprise Linux 6.1 - Memory: 48 GB Test Procedure Initial Setup: Two MySQL servers were started on two different hosts, configured as replication master and slave. 10 sysbench schemas were created, each with a single table: CREATE TABLE `sbtest` (    `id` int(10) unsigned NOT NULL AUTO_INCREMENT,    `k` int(10) unsigned NOT NULL DEFAULT '0',    `c` char(120) NOT NULL DEFAULT '',    `pad` char(60) NOT NULL DEFAULT '',    PRIMARY KEY (`id`),    KEY `k` (`k`) ) ENGINE=InnoDB DEFAULT CHARSET=latin1 10,000 rows were inserted in each of the 10 tables, for a total of 100,000 rows. When the inserts had replicated to the slave, the slave threads were stopped. The slave data directory was copied to a backup location and the slave threads' position in the master binlog noted. 10 sysbench clients, each configured with 10 threads, were spawned at the same time to generate a random schema load against each of the 10 schemas on the master.
Each sysbench client executed 10,000 "update key" statements: UPDATE sbtest set k=k+1 WHERE id = <random row> In total, this generated 100,000 update statements to later replicate during the test itself. Test Methodology: The number of slave workers to test with was configured using: SET GLOBAL slave_parallel_workers=<workers> Then the slave IO thread was started and the test waited for all the update queries to be copied over to the relay log on the slave. The benchmark clock was started and then the slave SQL thread was started. The test waited for the slave SQL thread to finish executing the 100k update queries, doing "select master_pos_wait()". When master_pos_wait() returned, the benchmark clock was stopped and the duration calculated. The calculated duration from the benchmark clock should be close to the time it took for the SQL thread to execute the 100,000 update queries. The 100k queries divided by this duration gave the benchmark metric, reported as Queries Per Second (QPS). Test Reset: The test-reset cycle was implemented as follows: · the slave was stopped · the slave data directory replaced with the previous backup · the slave restarted with the slave threads replication pointer repositioned to the point before the update queries in the binlog. The test could then be repeated with identical set of queries but a different number of slave worker threads, enabling a fair comparison. The Test-Reset cycle was repeated 3 times for 0-24 number of workers and the QPS metric calculated and averaged for each worker count. MySQL Configuration The relevant configuration settings used for MySQL are as follows: binlog-format=STATEMENT relay-log-info-repository=TABLE master-info-repository=TABLE As described in the test procedure, the slave_parallel_workers setting was modified as part of the test logic. The consequence of changing this setting is: 0 worker threads:    - current (i.e. single threaded) sequential mode    - 1 x IO thread and 1 x SQL thread    - SQL thread both reads and executes the events 1 worker thread:    - sequential mode    - 1 x IO thread, 1 x Coordinator SQL thread and 1 x Worker thread    - coordinator reads the event and hands it to the worker who executes 2+ worker threads:    - parallel execution    - 1 x IO thread, 1 x Coordinator SQL thread and 2+ Worker threads    - coordinator reads events and hands them to the workers who execute them Results Figure 1 below shows that Multi-Threaded Slaves deliver ~5x higher replication performance when configured with 10 worker threads, with the load evenly distributed across our 10 x schemas. This result is compared to the current replication implementation which is based on a single SQL thread only (i.e. zero worker threads). Figure 1: 5x Higher Performance with Multi-Threaded Slaves The following figure shows more detailed results, with QPS sampled and reported as the worker threads are incremented. The raw numbers behind this graph are reported in the Appendix section of this post. Figure 2: Detailed Results As the results above show, the configuration does not scale noticably from 5 to 9 worker threads. When configured with 10 worker threads however, scalability increases significantly. The conclusion therefore is that it is desirable to configure the same number of worker threads as schemas. Other conclusions from the results: · Running with 1 worker compared to zero workers just introduces overhead without the benefit of parallel execution. · As expected, having more workers than schemas adds no visible benefit. 
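As a rough illustration of the measurement step described in the Test Methodology above, the timing around the slave SQL thread could be driven from a client roughly as follows. This is a JDBC sketch under stated assumptions: the connection string, credentials, binlog file name and position are placeholders, the MySQL Connector/J driver is assumed to be on the classpath, and the original benchmark was not necessarily scripted this way.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch of one measurement: start the SQL thread, wait for it to reach a
// known master binlog position, and derive queries-per-second from the elapsed
// time. Connection details, binlog file name and position are placeholders.
public class SlaveBenchmarkStep {
    public static void main(String[] args) throws Exception {
        try (Connection slave = DriverManager.getConnection(
                "jdbc:mysql://slave-host:3306/", "root", "secret");
             Statement st = slave.createStatement()) {

            st.execute("SET GLOBAL slave_parallel_workers = 10"); // worker count under test
            long start = System.nanoTime();
            st.execute("START SLAVE SQL_THREAD");

            // Blocks until the SQL thread has applied events up to the noted master position
            st.executeQuery("SELECT MASTER_POS_WAIT('master-bin.000002', 123456789)");

            double seconds = (System.nanoTime() - start) / 1_000_000_000.0;
            System.out.printf("QPS: %.0f%n", 100_000 / seconds);          // 100k updates applied
        }
    }
}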
Aside from what is shown in the results above, testing also demonstrated that the following settings had a very positive effect on slave performance: relay-log-info-repository=TABLE master-info-repository=TABLE For 5+ workers, it was up to 2.3 times as fast to run with TABLE compared to FILE.

Conclusion As the results demonstrate, Multi-Threaded Slaves deliver significant performance increases to MySQL replication when handling multiple schemas. This, and the other replication enhancements introduced in MySQL 5.6, are fully available for you to download and evaluate now from the MySQL Developer site (select the Development Release tab). You can learn more about MySQL 5.6 from the documentation. Please don’t hesitate to comment on this or other replication blogs with feedback and questions.

Appendix – Detailed Results

    Read the article

  • TFS API Change WorkItem CreatedDate And ChangedDate To Historic Dates

    - by Tarun Arora
There may be times when you need to modify the value of the fields “System.CreatedDate” and “System.ChangedDate” on a work item. Richard Hundhausen has a great blog post with plenty of reasons why you may (or may not) need to set the values of these fields to historic dates. In this blog post I’ll show you how to:
- Create a PBI work item linked to a Task work item by pre-setting the value of the field ‘System.ChangedDate’ to a historic date
- Change the value of the field ‘System.CreatedDate’ to a historic date
- Simulate the historic burn down of a task type work item in a sprint
- Explain the impact of updating values of the fields CreatedDate and ChangedDate on the Sprint burn down chart

Rules of Play
1. You need to be a member of the Project Collection Service Accounts group
2. You need to use ‘WorkItemStoreFlags.BypassRules’ when you instantiate the WorkItemStore service // Instantiate Work Item Store with the BypassRules flag _wis = new WorkItemStore(_tfs, WorkItemStoreFlags.BypassRules);
3. You cannot set the ChangedDate
   - Less than the changed date of the previous revision
   - Greater than the current date

Walkthrough The walkthrough contains 5 parts:
00 – Required References
01 – Connect to TFS Programmatically
02 – Create a Work Item Programmatically
03 – Set the values of fields ‘System.ChangedDate’ and ‘System.CreatedDate’ to historic dates
04 – Results of our experiment
Let’s get started…

00 – Required References Microsoft.TeamFoundation.dll Microsoft.TeamFoundation.Client.dll Microsoft.TeamFoundation.Common.dll Microsoft.TeamFoundation.WorkItemTracking.Client.dll

01 – Connect to TFS Programmatically I have an in-depth blog post on how to connect to TFS programmatically in case you are interested. However, the code snippet below will enable you to connect to TFS using the Team Project Picker. // Services I need access to globally private static TfsTeamProjectCollection _tfs; private static ProjectInfo _selectedTeamProject; private static WorkItemStore _wis; // Connect to TFS Using Team Project Picker public static bool ConnectToTfs() { var isSelected = false; // The user is allowed to select only one project var tfsPp = new TeamProjectPicker(TeamProjectPickerMode.SingleProject, false); tfsPp.ShowDialog(); // The TFS project collection _tfs = tfsPp.SelectedTeamProjectCollection; if (tfsPp.SelectedProjects.Any()) { // The selected Team Project _selectedTeamProject = tfsPp.SelectedProjects[0]; isSelected = true; } return isSelected; }

02 – Create a Work Item Programmatically In the code snippet below I have created a Product Backlog Item and a Task type work item and then linked them together as parent and child. Note – You will have to set the ChangedDate to a historic date when you create the work item. Remember, if you try to set the ChangedDate to a value earlier than last assigned you will receive the following exception… TF26212: Team Foundation Server could not save your changes. There may be problems with the work item type definition. Try again or contact your Team Foundation Server administrator. If you notice below, I have added a few seconds each time I have modified the ‘ChangedDate’ just to avoid running into the exception listed above. 
// Create Linked Work Items and return Ids private static List<int> CreateWorkItemsProgrammatically() { // Instantiate Work Item Store with the ByPassRules flag _wis = new WorkItemStore(_tfs, WorkItemStoreFlags.BypassRules); // List of work items to return var listOfWorkItems = new List<int>(); // Create a new Product Backlog Item var p = new WorkItem(_wis.Projects[_selectedTeamProject.Name].WorkItemTypes["Product Backlog Item"]); p.Title = "This is a new PBI"; p.Description = "Description"; p.IterationPath = string.Format("{0}\\Release 1\\Sprint 1", _selectedTeamProject.Name); p.AreaPath = _selectedTeamProject.Name; p["Effort"] = 10; // Just double checking that ByPassRules is set to true if (_wis.BypassRules) { p.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01"); } if (p.Validate().Count == 0) { p.Save(); listOfWorkItems.Add(p.Id); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in p.Validate()) { Console.WriteLine(" - '{0}' ", e); } } var t = new WorkItem(_wis.Projects[_selectedTeamProject.Name].WorkItemTypes["Task"]); t.Title = "This is a task"; t.Description = "Task Description"; t.IterationPath = string.Format("{0}\\Release 1\\Sprint 1", _selectedTeamProject.Name); t.AreaPath = _selectedTeamProject.Name; t["Remaining Work"] = 10; if (_wis.BypassRules) { t.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01"); } if (t.Validate().Count == 0) { t.Save(); listOfWorkItems.Add(t.Id); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in t.Validate()) { Console.WriteLine(" - '{0}' ", e); } } var linkTypEnd = _wis.WorkItemLinkTypes.LinkTypeEnds["Child"]; p.Links.Add(new WorkItemLink(linkTypEnd, t.Id) {ChangedDate = Convert.ToDateTime("2012-01-01").AddSeconds(20)}); if (_wis.BypassRules) { p.Fields["System.ChangedDate"].Value = Convert.ToDateTime("2012-01-01").AddSeconds(20); } if (p.Validate().Count == 0) { p.Save(); } else { Console.WriteLine(">> Following exception(s) encountered during work item save: "); foreach (var e in p.Validate()) { Console.WriteLine(" - '{0}' ", e); } } return listOfWorkItems; } 03 – Set the value of “Created Date” and Change the value of “Changed Date” to Historic Dates The CreatedDate can only be changed after a work item has been created. If you try and set the CreatedDate to a historic date at the time of creation of a work item, it will not work. 
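Before the full simulation shown next, here is a minimal hedged sketch of just that two-step pattern. It assumes the same _wis WorkItemStore opened with WorkItemStoreFlags.BypassRules as above; workItemId is a hypothetical id of a work item created earlier with a historic ChangedDate.

    // Hedged sketch, not the post's full simulation: reset CreatedDate after creation
    var wi = _wis.GetWorkItem(workItemId);

    // Nudge ChangedDate forward a few seconds so it is never earlier than the previous revision
    wi.Fields["System.ChangedDate"].Value =
        Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10);

    // Only now can CreatedDate be pushed back to the (historic) ChangedDate
    wi.Fields["System.CreatedDate"].Value =
        Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value);

    if (wi.Validate().Count == 0)
    {
        wi.Save();
    }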
// Lets do a work item effort burn down simulation by updating the ChangedDate & CreatedDate to historic Values private static void WorkItemChangeSimulation(IEnumerable<int> listOfWorkItems) { foreach (var id in listOfWorkItems) { var wi = _wis.GetWorkItem(id); switch (wi.Type.Name) { case "ProductBacklogItem": if (wi.State.ToLower() == "new") wi.State = "Approved"; // Advance the changed date by few seconds wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); // Set the CreatedDate to Changed Date wi.Fields["System.CreatedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); wi.Save(); break; case "Task": // Advance the changed date by few seconds wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); // Set the CreatedDate to Changed date wi.Fields["System.CreatedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(10); wi.Save(); break; } } // A mock sprint start date var sprintStart = DateTime.Today.AddDays(-5); // A mock sprint end date var sprintEnd = DateTime.Today.AddDays(5); // What is the total Sprint duration var totalSprintDuration = (sprintEnd - sprintStart).Days; // How much of the sprint have we already covered var noOfDaysIntoSprint = (DateTime.Today - sprintStart).Days; // Get the effort assigned to our tasks var totalEffortRemaining = QueryTaskTotalEfforRemaining(listOfWorkItems); // Defining how much effort to burn every day decimal dailyBurnRate = totalEffortRemaining / totalSprintDuration < 1 ? 1 : totalEffortRemaining / totalSprintDuration; // we have just created one task var totalNoOfTasks = 1; var simulation = sprintStart; var currentDate = DateTime.Today.Date; // Carry on till effort has been burned down from sprint start to today while (simulation.Date != currentDate.Date) { var dailyBurnRate1 = dailyBurnRate; // A fixed amount needs to be burned down each day while (dailyBurnRate1 > 0) { // burn down bit by bit from all unfinished task type work items foreach (var id in listOfWorkItems) { var wi = _wis.GetWorkItem(id); var isDirty = false; // Set the status to in progress if (wi.State.ToLower() == "to do") { wi.State = "In Progress"; isDirty = true; } // Ensure that there is enough effort remaining in tasks to burn down the daily burn rate if (QueryTaskTotalEfforRemaining(listOfWorkItems) > dailyBurnRate1) { // If there is less than 1 unit of effort left in the task, burn it all if (Convert.ToDecimal(wi["Remaining Work"]) <= 1) { wi["Remaining Work"] = 0; dailyBurnRate1 = dailyBurnRate1 - Convert.ToDecimal(wi["Remaining Work"]); isDirty = true; } else { // How much to burn from each task? var toBurn = (dailyBurnRate / totalNoOfTasks) < 1 ? 
1 : (dailyBurnRate / totalNoOfTasks); // Check that the task has enough effort to allow burnForTask effort if (Convert.ToDecimal(wi["Remaining Work"]) >= toBurn) { wi["Remaining Work"] = Convert.ToDecimal(wi["Remaining Work"]) - toBurn; dailyBurnRate1 = dailyBurnRate1 - toBurn; isDirty = true; } else { wi["Remaining Work"] = 0; dailyBurnRate1 = dailyBurnRate1 - Convert.ToDecimal(wi["Remaining Work"]); isDirty = true; } } } else { dailyBurnRate1 = 0; } if (isDirty) { if (Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).Date == simulation.Date) { wi.Fields["System.ChangedDate"].Value = Convert.ToDateTime(wi.Fields["System.ChangedDate"].Value).AddSeconds(20); } else { wi.Fields["System.ChangedDate"].Value = simulation.AddSeconds(20); } wi.Save(); } } } // Increase date by 1 to perform daily burn down by day simulation = Convert.ToDateTime(simulation).AddDays(1); } } // Get the Total effort remaining in the current sprint private static decimal QueryTaskTotalEfforRemaining(List<int> listOfWorkItems) { var unfinishedWorkInCurrentSprint = _wis.GetQueryDefinition( new Guid(QueryAndGuid.FirstOrDefault(c => c.Key == "Unfinished Work").Value)); var parameters = new Dictionary<string, object> { { "project", _selectedTeamProject.Name } }; var q = new Query(_wis, unfinishedWorkInCurrentSprint.QueryText, parameters); var results = q.RunLinkQuery(); var wis = new List<WorkItem>(); foreach (var result in results) { var _wi = _wis.GetWorkItem(result.TargetId); if (_wi.Type.Name == "Task" && listOfWorkItems.Contains(_wi.Id)) wis.Add(_wi); } return wis.Sum(r => Convert.ToDecimal(r["Remaining Work"])); }   04 – The Results If you are still reading, the results are beautiful! Image 1 – Create work item with Changed Date pre-set to historic date Image 2 – Set the CreatedDate to historic date (Same as the ChangedDate) Image 3 – Simulate of effort burn down on a task via the TFS API   Image 4 – The history of changes on the Task. So, essentially this task has burned 1 hour per day Sprint Burn Down Chart – What’s not possible? The Sprint burn down chart is calculated from the System.AuthorizedDate and not the System.ChangedDate/System.CreatedDate. So, though you can change the System.ChangedDate and System.CreatedDate to historic dates you will not be able to synthesize the sprint burn down chart. Image 1 – By changing the Created Date and Changed Date to ‘18/Oct/2012’ you would have expected the burn down to have been impacted, but it won’t be, because the sprint burn down chart uses the value of field ‘System.AuthorizedDate’ to calculate the unfinished work points. The AsOf queries that are used to calculate the unfinished work points use the value of the field ‘System.AuthorizedDate’. Image 2 – Using the above code I burned down 1 hour effort per day over 5 days from the task work item, I would have expected the sprint burn down to show a constant burn down, instead the burn down shows the effort exhausted on the 24th itself. Simply because the burn down is calculated using the ‘System.AuthorizedDate’. Now you would ask… “Can I change the value of the field System.AuthorizedDate to a historic date” Unfortunately that’s not possible! 
You will run into the exception ValidationException – “TF26194: The value for field ‘Authorized Date’ cannot be changed.”

Conclusion
- You need to be a member of the Project Collection Service Accounts group in order to set the fields ‘System.ChangedDate’ and ‘System.CreatedDate’ to historic dates
- You need to instantiate the WorkItemStore using the WorkItemStoreFlags.BypassRules flag
- The System.ChangedDate needs to be set to a historic date at the time of work item creation. You cannot reset the ChangedDate to a date earlier than the existing ChangedDate, and you cannot reset the ChangedDate to a date greater than the current date time.
- The System.CreatedDate can only be reset after a work item has been created. You cannot set the CreatedDate at the time of work item creation. The CreatedDate cannot be greater than the current date. You can however reset the CreatedDate to a date earlier than the existing value.
- You will not be able to synthesize the Sprint burn down chart by changing the value of System.ChangedDate and System.CreatedDate to historic dates, since the burn down chart uses AsOf queries to calculate the unfinished work points, which internally use the System.AuthorizedDate and NOT the System.ChangedDate & System.CreatedDate
- System.AuthorizedDate cannot be set to a historic date using the TFS API
Read other posts on using the TFS API here… Enjoy!

    Read the article

  • Use a Fake Http Channel to Unit Test with HttpClient

    - by Steve Michelotti
Applications get data from lots of different sources. The most common is to get data from a database or a web service. Typically, we encapsulate calls to a database in a Repository object and we create some sort of IRepository interface as an abstraction to decouple between layers and enable easier unit testing by leveraging faking and mocking. This works great for database interaction. However, when consuming a RESTful web service, this is not always the best approach. The WCF Web APIs that are available on CodePlex (current drop is Preview 3) provide a variety of features to make building HTTP REST services more robust. When you download the latest bits, you’ll also find a new HttpClient which has been updated for .NET 4.0 as compared to the one that shipped for 3.5 in the original REST Starter Kit. The HttpClient currently provides the best API for consuming REST services on the .NET platform, and the WCF Web APIs provide a number of extension methods which extend HttpClient and make it even easier to use. Let’s say you have a client application that is consuming an HTTP service – this could be Silverlight, WPF, or any UI technology, but for my example I’ll use an MVC application: 1: using System; 2: using System.Net.Http; 3: using System.Web.Mvc; 4: using FakeChannelExample.Models; 5: using Microsoft.Runtime.Serialization; 6:   7: namespace FakeChannelExample.Controllers 8: { 9: public class HomeController : Controller 10: { 11: private readonly HttpClient httpClient; 12:   13: public HomeController(HttpClient httpClient) 14: { 15: this.httpClient = httpClient; 16: } 17:   18: public ActionResult Index() 19: { 20: var response = httpClient.Get("Person(1)"); 21: var person = response.Content.ReadAsDataContract<Person>(); 22:   23: this.ViewBag.Message = person.FirstName + " " + person.LastName; 24: 25: return View(); 26: } 27: } 28: } On line #20 of the code above you can see I’m performing an HTTP GET request to a Person resource exposed by an HTTP service. On line #21, I use the ReadAsDataContract() extension method provided by the WCF Web APIs to deserialize the response into a Person object. In this example, the HttpClient is being passed into the constructor by MVC’s dependency resolver – in this case, I’m using StructureMap as an IoC container and my StructureMap initialization code looks like this: 1: using StructureMap; 2: using System.Net.Http; 3:   4: namespace FakeChannelExample 5: { 6: public static class IoC 7: { 8: public static IContainer Initialize() 9: { 10: ObjectFactory.Initialize(x => 11: { 12: x.For<HttpClient>().Use(() => new HttpClient("http://localhost:31614/")); 13: }); 14: return ObjectFactory.Container; 15: } 16: } 17: } My controller code currently depends on a concrete instance of the HttpClient. Now I *could* create some sort of interface and wrap the HttpClient in this interface and use that object inside my controller instead – however, there are a few reasons why that is not desirable: For one thing, the API provided by the HttpClient provides nice features for dealing with HTTP services. I don’t really *want* these to look like C# RPC method calls – when HTTP services have REST features, I may want to inspect HTTP response headers and hypermedia contained within the message so that I can make intelligent decisions as to what to do next in my workflow (although I don’t happen to be doing these things in my example above) – this type of workflow is common in hypermedia REST scenarios. 
If I just encapsulate HttpClient behind some IRepository interface and make it look like a C# RPC method call, it will become difficult to take advantage of these types of things. Second, it could get pretty mind-numbing to have to create interfaces all over the place just to wrap the HttpClient. Then you’re probably going to have to hard-code HTTP knowledge into your code to formulate requests rather than just “following the links” that the hypermedia in a message might provide. Third, at first glance it might appear that we need to create an interface to facilitate unit testing, but actually it’s unnecessary. Even though the code above is dependent on a concrete type, it’s actually very easy to fake the data in a unit test. The HttpClient provides a Channel property (of type HttpMessageChannel) which allows you to create a fake message channel which can be leveraged in unit testing. In this case, what I want is to be able to write a unit test that just returns fake data. I also want this to be as re-usable as possible for my unit testing. I want to be able to write a unit test that looks like this: 1: [TestClass] 2: public class HomeControllerTest 3: { 4: [TestMethod] 5: public void Index() 6: { 7: // Arrange 8: var httpClient = new HttpClient("http://foo.com"); 9: httpClient.Channel = new FakeHttpChannel<Person>(new Person { FirstName = "Joe", LastName = "Blow" }); 10:   11: HomeController controller = new HomeController(httpClient); 12:   13: // Act 14: ViewResult result = controller.Index() as ViewResult; 15:   16: // Assert 17: Assert.AreEqual("Joe Blow", result.ViewBag.Message); 18: } 19: } Notice on line #9, I’m setting the Channel property of the HttpClient to be a fake channel. I’m also specifying the fake object that I want to be in the response on my “fake” Http request. I don’t need to rely on any mocking frameworks to do this. All I need is my FakeHttpChannel. The code to do this is not complex: 1: using System; 2: using System.IO; 3: using System.Net.Http; 4: using System.Runtime.Serialization; 5: using System.Threading; 6: using FakeChannelExample.Models; 7:   8: namespace FakeChannelExample.Tests 9: { 10: public class FakeHttpChannel<T> : HttpClientChannel 11: { 12: private T responseObject; 13:   14: public FakeHttpChannel(T responseObject) 15: { 16: this.responseObject = responseObject; 17: } 18:   19: protected override HttpResponseMessage Send(HttpRequestMessage request, CancellationToken cancellationToken) 20: { 21: return new HttpResponseMessage() 22: { 23: RequestMessage = request, 24: Content = new StreamContent(this.GetContentStream()) 25: }; 26: } 27:   28: private Stream GetContentStream() 29: { 30: var serializer = new DataContractSerializer(typeof(T)); 31: Stream stream = new MemoryStream(); 32: serializer.WriteObject(stream, this.responseObject); 33: stream.Position = 0; 34: return stream; 35: } 36: } 37: } The HttpClientChannel provides a Send() method which you can override to return any HttpResponseMessage that you want. You can see I’m using the DataContractSerializer to serialize the object and write it to a stream. That’s all you need to do. In the example above, the only thing I’ve chosen to do is to provide a way to return different response objects. But there are many more features you could add to your own re-usable FakeHttpChannel. For example, you might want to provide the ability to add HTTP headers to the message. You might want to use a different serializer other than the DataContractSerializer. 
You might want to provide custom hypermedia in the response as well as just an object, or set HTTP response codes. This list goes on. This is just one example of the really cool features being added to the next version of WCF to enable various HTTP scenarios. The code sample for this post can be downloaded here.
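As a rough illustration of those extension ideas, here is a hedged sketch (not part of the original sample) of a channel that also lets a test set the status code and extra response headers. The class and parameter names are made up for the example, and the exact HttpResponseMessage surface may differ slightly across WCF Web API preview drops.

    using System.Collections.Generic;
    using System.IO;
    using System.Net;
    using System.Net.Http;
    using System.Runtime.Serialization;
    using System.Threading;

    namespace FakeChannelExample.Tests
    {
        // Hypothetical extension of the FakeHttpChannel idea: fake body + status code + headers
        public class ConfigurableFakeHttpChannel<T> : HttpClientChannel
        {
            private readonly T responseObject;
            private readonly HttpStatusCode statusCode;
            private readonly IDictionary<string, string> headers;

            public ConfigurableFakeHttpChannel(T responseObject, HttpStatusCode statusCode, IDictionary<string, string> headers)
            {
                this.responseObject = responseObject;
                this.statusCode = statusCode;
                this.headers = headers ?? new Dictionary<string, string>();
            }

            protected override HttpResponseMessage Send(HttpRequestMessage request, CancellationToken cancellationToken)
            {
                var response = new HttpResponseMessage()
                {
                    RequestMessage = request,
                    StatusCode = statusCode,
                    Content = new StreamContent(this.GetContentStream())
                };

                // Copy any custom headers the test asked for onto the fake response
                foreach (var header in this.headers)
                {
                    response.Headers.Add(header.Key, header.Value);
                }

                return response;
            }

            private Stream GetContentStream()
            {
                var serializer = new DataContractSerializer(typeof(T));
                Stream stream = new MemoryStream();
                serializer.WriteObject(stream, this.responseObject);
                stream.Position = 0;
                return stream;
            }
        }
    }

A test could then assert on behaviour driven by the response itself (for example, a controller that reacts to a 404 or reads a custom header) without any real HTTP traffic.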

    Read the article

  • Inventory Management concepts in XNA game

    - by user1332755
    I am trying to code the inventory system in my first real game so I have very little experience in both c# and game engine development. Basically, I need some general guidance and tips with how to structure and organize these sorts of systems. Please tell me if I am on the right track or not before I get too deep into making some badly structured system. It's fine if you don't feel like looking through my code, suggestions about general structure would also be appreciated. What I am aiming to end up with is some sort of system like Minecraft or Terraria. It must include: main inventory GUI (items can be dragged and placed in whatever slot desired Itembar outside of the main inventory which can be assigned to certain items the ability to use items from either location So far, I have 4 main classes: Inventory holds the general info and methods, inventoryslot holds info for individual slots, Itembar holds all info and methods for itself, and finally, ItemManager to manage interactions between the two and hold a master list of items. So far, my itembar works perfectly and interacts well with mousedragging items into and out of it as well as activating the item effect. Here is the code I have so far: (there is a lot but I will try to keep it relevant) This is the code for the itembar on the main screen: class Itembar { public Texture2D itembarfull, iSelected; public static Rectangle itembar = new Rectangle(5, 218, 40, 391); public Rectangle box1 = new Rectangle(itembar.X, 218, 40, 40); //up to 10 Rectangles for each slot public int Selected = 0; private ItemManager manager; public Itembar(Texture2D texture, Texture2D texture3, ItemManager mann) { itembarfull = texture; iSelected = texture3; manager = mann; } public void Update(GameTime gametime) { } public void Draw(SpriteBatch spriteBatch) { spriteBatch.Draw( itembarfull, new Vector2 (itembar.X, itembar.Y), null, Color.White, 0.0f, Vector2.Zero, 1.0f, SpriteEffects.None, 1.0f); if (Selected == 1) spriteBatch.Draw(iSelected, new Rectangle(box1.X-3, box1.Y-3, box1.Width+6, box1.Height+6), Color.White); //goes up to 10 slots } public int Box1Query() { foreach (Item item in manager.items) { if(box1.Contains(item.BoundingBox)) return manager.items.IndexOf(item); } return 999; } //10 different box queries It is working fine right now. I just put an Item in there and the box will query things like the item's effects, stack number, consumable or not etc...This one is basically almost complete. Here is the main inventory class: class Inventory { public bool isActive; public List<Rectangle> mainSlots = new List<Rectangle>(24); public List<InventorySlot> mainSlotscheck = new List<InventorySlot>(24); public static Rectangle inv = new Rectangle(841, 469, 156, 231); public Rectangle invfull = new Rectangle(inv.X, inv.Y, inv.Width, inv.Height); public Rectangle inv1 = new Rectangle(inv.X + 4, inv.Y +3, 32, 32); //goes up to inv24 resulting in a 6x4 grid of Rectangles public Inventory() { mainSlots.Add(inv1); mainSlots.Add(inv2); mainSlots.Add(inv3); mainSlots.Add(inv4); //goes up to 24 foreach (Rectangle slot in mainSlots) mainSlotscheck.Add(new InventorySlot(slot)); } //update and draw methods are empty because im not too sure what to put there public int LookforfreeSlot() { int slotnumber = 999; for (int x = 0; x < mainSlots.Count; x++) { if (mainSlotscheck[x].isFree) { slotnumber = x; break; } } return slotnumber; } } } LookforFreeSlot() method is meant to be called when I do AddtoInventory(). 
I'm kinda stumped about what other things I need to put in this class. Here is the inventorySlot class: (its main purpose is to check the bool "isFree" to see whether or not something already occupies the slot. But i guess it can also do other stuff like get item info.) class InventorySlot { public int X, Y; public int Width = 32, Height = 32; public Vector2 Position; public int slotnumber; public bool free = true; public int? content = null; public bool isFree { get { return free; } set { free = value; } } public InventorySlot(Rectangle slot) { slot = new Rectangle(X, Y, Width, Height); } } } Finally, here is the ItemManager (I am omitting the master list because it is too long) class ItemManager { public List<Item> items = new List<Item>(20); public List<Item> inventory1 = new List<Item>(24); public List<Item> inventory2 = new List<Item>(24); public List<Item> inventory3 = new List<Item>(24); public List<Item> inventory4 = new List<Item>(24); public Texture2D icon, filta; private Rectangle msRect; MouseState mouseState; public int ISelectedIndex; Inventory inventory; SpriteFont font; public void GenerateItems() { items.Add(new Item(new Rectangle(0, 0, 32, 32), icon, font)); items[0].name = "Grass Chip"; items[0].itemID = 0; items[0].consumable = true; items[0].stackable = true; items[0].maxStack = 99; items.Add(new Item(new Rectangle(32, 0, 32, 32), icon, font)); //master list continues. it will generate all items in the game; } public ItemManager(Inventory inv, Texture2D itemsheet, Rectangle mouseRectt, MouseState ms, Texture2D fil, SpriteFont f) { icon = itemsheet; msRect = mouseRectt; filta = fil; mouseState = ms; inventory = inv; font = f; } //once again, no update or draw public void mousedrag() { items[0].DestinationRect = new Rectangle (msRect.X, msRect.Y, 32, 32); items[0].dragging = true; } public void AddtoInventory(Item item) { int index = inventory.LookforfreeSlot(); if (index == 999) return; item.DestinationRect = inventory.mainSlots[index]; inventory.mainSlotscheck[index].content = item.itemID; inventory.mainSlotscheck[index].isFree = false; item.IsActive = true; } } } The mousedrag works pretty well. AddtoInventory doesn't work because LookforfreeSlot doesn't work. Relevant code from the main program: When I want to add something to the main inventory, I do something like this: foreach (Particle ether in ether1.ethers) { if (ether.isCollected) itemmanager.AddtoInventory(itemmanager.items[14]); } This turned out to be much longer than I had expected :( But I hope someone is interested enough to comment.

    Read the article
