Search Results

Search found 23517 results on 941 pages for 'visual basic'.


  • Perform Unit Conversions with the Windows 7 Calculator

    - by Matthew Guay
    Want to easily convert area, volume, temperature, and many other units? With the Calculator in Windows 7, it’s easy to convert almost any unit into another. The New Calculator in Windows 7 The Calculator received a visual overhaul in Windows 7, but at first glance it doesn’t seem to have any new functionality. Here’s Windows 7’s Calculator on the left, with Vista’s calculator on the right. But looks can be deceiving: Windows 7’s Calculator has lots of exciting new features. Let’s try them out. Simply type Calculator in the Start menu search. To uncover the new features, click the View menu. Here you can select many different modes, including the Unit Conversion mode, which we will look at. When you select Unit Conversion mode, the Calculator expands with a form on the left side. This conversion pane has three drop-down menus. From the top one, select the type of unit you want to convert. In the next two menus, select the units you wish to convert from and to. For instance, here we selected Temperature in the first menu, Degrees Fahrenheit in the second menu, and Degrees Celsius in the third menu. Enter the value you wish to convert in the From box, and the conversion will automatically appear in the bottom box. The Calculator contains dozens of conversion values, including more uncommon ones. So if you’ve ever wanted to know how many US gallons are in a UK gallon, or how many knots a supersonic jet travels in an hour, this is a great tool for you! Conclusion Windows 7 is filled with little changes that give you an all-around better experience in Windows to help you work more efficiently and productively. With the new features in the Calculator, you just might feel a little smarter, too!
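    If you ever want the same Fahrenheit-to-Celsius conversion outside the Calculator, the underlying formula is simple to reproduce. Here is a minimal C# sketch of it (the class and method names are just illustrative, not part of Windows):

        using System;

        class TemperatureConversion
        {
            // Celsius = (Fahrenheit - 32) * 5 / 9, the same conversion the Unit Conversion pane performs.
            static double FahrenheitToCelsius(double fahrenheit) => (fahrenheit - 32.0) * 5.0 / 9.0;

            static void Main()
            {
                Console.WriteLine(FahrenheitToCelsius(212)); // boiling point of water: prints 100
            }
        }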

    Read the article

  • SQL SERVER – Introduction to Adaptive ETL Tool – How adaptive is your ETL?

    - by pinaldave
    I am often reminded of the fact that BI/data warehousing infrastructure is very brittle and not very adaptive to change. There are lots of basic use cases where data needs to be frequently loaded into SQL Server or another database. What I have found is that as long as the sources and targets stay the same, SSIS or any other ETL tool for that matter does a pretty good job handling these types of scenarios. But what happens when you are faced with more challenging scenarios, where the data formats and possibly the data types of the source data are changing from customer to customer? Let’s examine a real-life situation where a health management company receives claims data from their customers in various source formats. Even though this company supplied all their customers with the same claims forms, they ended up building one-off ETL applications to process the claims for each customer. Why, you ask? Well, it turned out that the claims data from various regional hospitals they needed to process had slightly different data formats, e.g. “integer” versus “string” data field definitions. Moreover, the data itself was represented with slight nuances, e.g. “0001124” or “1124” or “0000001124” to represent a particular account number, which forced them, as I alluded to above, to build new ETL processes for each customer in order to overcome the inconsistencies in the various claims forms. As a result, they experienced a lot of redundancy in these ETL processes and recognized quickly that their system would become more difficult to maintain over time. So imagine for a moment that you could use an ETL tool that helps you abstract the data formats so that your ETL transformation process becomes more reusable. Imagine that one claims form represents a data item as a string – acc_no(varchar) – while a second claims form represents the same data item as an integer – account_no(integer). This would break your traditional ETL process as the data mappings are hard-wired. But in a world of abstracted definitions, all you need to do is create parallel data mappings to a common data representation used within your ETL application; that is, map both external data fields to a common attribute whose name and type remain unchanged within the application. acc_no(varchar) is mapped to account_number(integer) (expressor Studio first claim form schema mapping), and account_no(integer) is also mapped to account_number(integer) (expressor Studio second claim form schema mapping). All the data processing logic that follows manipulates the data as an integer value named account_number. Well, these are the kind of problems that the expressor data integration solution automates for you. I’ve been following them since last year and encourage you to check them out by downloading their free expressor Studio ETL software. Reference: Pinal Dave (http://blog.SQLAuthority.com) Filed under: Business Intelligence, Pinal Dave, PostADay, SQL, SQL Authority, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology Tagged: ETL, SSIS
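    Outside of any specific tool, the idea of mapping both source formats to one common attribute can be sketched in a few lines of C#. This is only an illustration of the concept described above; the AccountClaim type, method names, and field names are hypothetical and not expressor Studio code:

        // Common representation: every downstream transformation only ever sees an integer account number.
        public class AccountClaim
        {
            public int AccountNumber { get; set; }
        }

        public static class ClaimMappings
        {
            // First claim form: acc_no arrives as a zero-padded string, e.g. "0001124" or "1124".
            public static AccountClaim FromVarcharForm(string accNo) =>
                new AccountClaim { AccountNumber = int.Parse(accNo) }; // "0001124" and "1124" both map to 1124

            // Second claim form: account_no already arrives as an integer.
            public static AccountClaim FromIntegerForm(int accountNo) =>
                new AccountClaim { AccountNumber = accountNo };
        }

    Either mapping produces the same AccountClaim, so the processing logic that follows never needs to know which claims form the data came from.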

    Read the article

  • SQL SERVER – Importance of User Without Login – T-SQL Demo Script

    - by pinaldave
    Earlier I wrote a blog post about SQL SERVER – Importance of User Without Login and my friend and SQL Expert Vinod Kumar has written an excellent follow-up blog post about Contained Databases inside SQL Server 2012. Now lots of people have asked me if I can explain the same concept again, so here is a small demonstration of it. Let me show you how a user without login can help. Before we continue on this subject I strongly recommend that you read my earlier blog post here. In the following demo I am going to demonstrate the following situation. Login using the System Admin account Create a user without login Checking Access Impersonate the user without login Checking Access Revert Impersonation Give Permission to user without login Impersonate the user without login Checking Access Revert Impersonation Clean up USE [AdventureWorks2012] GO -- Step 1 : Login using the SA -- Step 2 : Create Login Less User CREATE USER [testguest] WITHOUT LOGIN WITH DEFAULT_SCHEMA=[dbo] GO -- Step 3 : Checking access to Tables SELECT * FROM sys.tables; -- Step 4 : Changing the execution context EXECUTE AS USER = 'testguest'; GO -- Step 5 : Checking access to Tables SELECT * FROM sys.tables; GO -- Step 6 : Reverting Permissions REVERT; -- Step 7 : Giving more Permissions to testguest user GRANT SELECT ON [dbo].[ErrorLog] TO [testguest]; GRANT SELECT ON [dbo].[DatabaseLog] TO [testguest]; GO -- Step 8 : Changing the execution context EXECUTE AS USER = 'testguest'; GO -- Step 9 : Checking access to Tables SELECT * FROM sys.tables; GO -- Step 10 : Reverting Permissions REVERT; GO -- Step 11: Clean up DROP USER [testguest] GO Here at step 9 we will be able to notice how a user without login gets access to some of the data/objects which we gave permission on. What am I going to prove with this example? Well, there can be different rights with different accounts. Once the login is authenticated, it makes sense to impersonate a user with only the necessary permissions for further operations. Again, this is a very basic and fundamental example. There are lots more points to be discussed as we go in future posts. Just do not take this blog post as a template and implement everything as it is. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: PostADay, SQL, SQL Authority, SQL Query, SQL Security, SQL Server, SQL Tips and Tricks, T SQL, Technology

    Read the article

  • How to add event receiver to SharePoint2010 content type programmatically

    - by ybbest
    Today, I’d like to show how to add an event receiver to a SharePoint 2010 content type programmatically. 1. Create an empty SharePoint project, add a class called ItemContentTypeEventReceiver, make it inherit from SPItemEventReceiver and implement your logic as below: public class ItemContentTypeEventReceiver : SPItemEventReceiver { private bool eventFiringEnabledStatus; public override void ItemAdded(SPItemEventProperties properties) { base.ItemAdded(properties); UpdateTitle(properties); } private void UpdateTitle(SPItemEventProperties properties) { SPListItem addedItem = properties.ListItem; string enteredTitle = addedItem["Title"] as string; addedItem["Title"] = enteredTitle + " Updated"; DisableItemEventsScope(); addedItem.Update(); EnableItemEventsScope(); } public override void ItemUpdated(SPItemEventProperties properties) { base.ItemUpdated(properties); UpdateTitle(properties); } private void DisableItemEventsScope() { eventFiringEnabledStatus = EventFiringEnabled; EventFiringEnabled = false; } private void EnableItemEventsScope() { eventFiringEnabledStatus = EventFiringEnabled; EventFiringEnabled = true; } } 2. Create a Site or Web scoped feature (depending on your requirements) and implement your feature event handler as below: public override void FeatureActivated(SPFeatureReceiverProperties properties) { SPWeb web = GetFeatureWeb(properties); //http://karinebosch.wordpress.com/walkthroughs/event-receivers-theory/ string assemblyName = System.Reflection.Assembly.GetExecutingAssembly().FullName; const string className = "YBBEST.AddEventReceiverToContentType.ItemContentTypeEventReceiver"; SPContentType contentType = web.ContentTypes["Item"]; AddEventReceiverToContentType(className, contentType, assemblyName, SPEventReceiverType.ItemAdded, SPEventReceiverSynchronization.Asynchronous); AddEventReceiverToContentType(className, contentType, assemblyName, SPEventReceiverType.ItemUpdated, SPEventReceiverSynchronization.Asynchronous); contentType.Update(); } protected static void AddEventReceiverToContentType(string className, SPContentType contentType, string assemblyName, SPEventReceiverType eventReceiverType, SPEventReceiverSynchronization eventReceiverSynchronization) { if (className == null) throw new ArgumentNullException("className"); if (contentType == null) throw new ArgumentNullException("contentType"); if (assemblyName == null) throw new ArgumentNullException("assemblyName"); SPEventReceiverDefinition eventReceiver = contentType.EventReceivers.Add(); eventReceiver.Synchronization = eventReceiverSynchronization; eventReceiver.Type = eventReceiverType; eventReceiver.Assembly = assemblyName; eventReceiver.Class = className; eventReceiver.Update(); } 3. Deploy your solution and now you have an event receiver attached to the Item content type. You can download the complete source code here. You can also check how to add an event receiver to a list using the SharePoint event receiver item in Visual Studio 2010 in my previous blog.

    Read the article

  • How to update all the SSIS packages&rsquo; Connection Managers in a BIDS project with PowerShell

    - by Luca Zavarella
    During the development of a BI solution, we all know that 80% of the time is spent during the ETL (Extract, Transform, Load) phase. If you use the BI stack tools provided by Microsoft SQL Server, this step is accomplished by the development of n Integration Services (SSIS) packages. In general, the number of packages made in the ETL phase for a non-trivial BI solution is quite significant. An SSIS package, therefore, extracts data from a source, "hammers" :) the data and then transfers it to a specific destination. Very often it happens that the connection to the source data is the same for all packages. Using Integration Services, this results in having the same Connection Manager (perhaps with the same name) for all packages: The source data of my BI solution comes from a Helper database (HLP); then, for each package that imports this data, I have the HLP Connection Manager (the use of a Shared Data Source is not recommended, because the Connection String is hard-wired and therefore you have to open the SSIS project and use the proper wizard to change it...). In order to change the HLP Connection String at runtime, we could use Package Configurations, or we could run our packages with DTLoggedExec by Davide Mauri (a must-have if you are developing with SQL Server 2005/2008). But my need was to change all the HLP connections in all packages within the SSIS Visual Studio project, because I had to version them through Team Foundation Server (TFS). A good scribe with a lot of patience could have changed all the connections by hand, double-clicking the HLP Connection Manager of each package and then changing the referenced server/database: Not being endowed with such virtues :) I took just a little time to write a small script in PowerShell, using the fact that an SSIS package (a .dtsx file) is nothing but an XML file, and therefore can be changed quite easily. I'm not a guru of PowerShell, but I managed more or less to put together the following lines of code: $LeftDelimiterString = "Initial Catalog=" $RightDelimiterString = ";Provider=" $ToBeReplacedString = "AstarteToBeReplaced" $ReplacingString = "AstarteReplacing" $MainFolder = "C:\MySSISPackagesFolder" $files = get-childitem "$MainFolder" *.dtsx ` | Where-Object {!($_.PSIsContainer)} foreach ($file in $files) { (Get-Content $file.FullName) ` | % {$_ -replace "($LeftDelimiterString)($ToBeReplacedString)($RightDelimiterString)", "`$1$ReplacingString`$3"} ` | Set-Content $file.FullName; } The script above just opens any SSIS package (.dtsx) in the supplied folder, then for each of them goes in search of the following text: Initial Catalog=AstarteToBeReplaced;Provider= and replaces the text found with this: Initial Catalog=AstarteReplacing;Provider= I won’t go into the details of each cmdlet used; I leave the reader to look those up. Alternatively, you can use a specific object model exposed in some .NET assemblies provided by Integration Services, or you can use the Pacman utility: Enjoy! :) P.S. Using TFS as the versioning system, before running the script I checked out the packages and, after the script executed successfully, I checked them in.

    Read the article

  • Predicting Likelihood of Click with Multiple Presentations

    - by Michel Adar
    When using predictive models to predict the likelihood of an ad or a banner being clicked on, it is common to ignore the fact that the same content may have been presented in the past to the same visitor. While the error may be small if the visitors do not often see repeated content, it may be very significant for sites where visitors come repeatedly. This is a well-recognized problem that usually gets handled with presentation thresholds – do not present the same content more than 6 times. Observations and measurements of visitor behavior provide evidence that something better is needed. Observations For a specific visitor, during a single session, for a banner in a not too prominent space, the second presentation of the same content is more likely to be clicked on than the first presentation. The difference can be 30% to 100% higher likelihood for the second presentation when compared to the first. That is, for example, if the first presentation has an average click rate of 1%, the second presentation may have an average CTR of between 1.3% and 2%. After the second presentation the CTR stays more or less the same for a few more presentations. The number of presentations in this plateau seems to vary by the location of the content on the page and by the visual attraction of the content. After these few presentations the CTR starts decaying with a curve that is very well approximated by an exponential decay. For example, the 13th presentation may have 90% the likelihood of the 12th, and the 14th has 90% the likelihood of the 13th. The decay constant also seems to depend on the visibility of the content. Modeling Options Now that we know the empirical data, we can propose modeling techniques that will correctly predict the likelihood of a click. Use presentation number as an input to the predictive model Probably the most straightforward approach is to add the presentation number as an input to the predictive model. While this is certainly a simple solution, it carries with it several problems, among them: If the model learns on each case, repeated non-clicks for the same content will reinforce the belief of the model on the non-clicker disproportionately. That is, the weight of a person who does not click for 200 presentations of an offer may be the same as 100 other people who on average click on the second presentation. The effect of the presentation number is not a customer characteristic or a piece of contextual data about the interaction with the customer, but it is contextual data about the content presented. Models tend to underestimate the effect of the presentation number. For these reasons it is not advisable to use this approach when the average number of presentations of the same content to the same person is above 3, or when there are cases of having the presentation number be very large, in the tens or hundreds. Use presentation number as a partitioning attribute to the predictive model In this approach we essentially build a separate predictive model for each presentation number. This approach overcomes all of the problems in the previous approach; nevertheless, it can be applied only when the volume of data is large enough to have these very specific sub-models converge.
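    As a rough illustration of the empirical curve described above (a base rate on the first presentation, an uplift on the second, a short plateau, then exponential decay), here is a small C# sketch. The base CTR, uplift factor, plateau length, and decay constant are hypothetical parameters chosen only to mirror the article's example numbers, not values from a real model:

        using System;

        class ClickLikelihood
        {
            // Empirical shape: base rate on the 1st presentation, an uplifted rate for
            // presentations 2..plateauEnd, then roughly 10% exponential decay per extra presentation.
            static double PredictedCtr(int presentation, double baseCtr = 0.01,
                                       double uplift = 1.5, int plateauEnd = 6, double decay = 0.9)
            {
                if (presentation <= 1) return baseCtr;
                double plateau = baseCtr * uplift;                           // e.g. 1% becomes 1.5% on the 2nd view
                if (presentation <= plateauEnd) return plateau;              // CTR stays roughly flat for a few views
                return plateau * Math.Pow(decay, presentation - plateauEnd); // then decays exponentially
            }

            static void Main()
            {
                for (int p = 1; p <= 14; p++)
                    Console.WriteLine($"presentation {p}: {PredictedCtr(p):P2}");
            }
        }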

    Read the article

  • Reverse-engineer SharePoint fields, content types and list instance—Part3

    - by ybbest
    Reverse-engineer SharePoint fields, content types and list instance—Part1 Reverse-engineer SharePoint fields, content types and list instance—Part2 Reverse-engineer SharePoint fields, content types and list instance—Part3 In Part 1 and Part 2 of this series, I demonstrated how to reverse-engineer SharePoint fields and content types. In this post I will cover how to include lookup fields in the content type and create list instances using these content types. Firstly, I will cover how to create a list instance and bind the custom content type to the custom list. 1. Create a custom list using the List Instance item in Visual Studio and select Custom List. 2. In the feature receiver add the Department content type to the Department list and remove the Item content type. C# AddContentTypeToList(web, "Department", "Department"); private void AddContentTypeToList(SPWeb web, string listName, string contentTypeName) { SPList list = web.Lists.TryGetList(listName); list.OnQuickLaunch = true; list.ContentTypesEnabled = true; list.Update(); SPContentType employeeContentType = web.ContentTypes[contentTypeName]; list.ContentTypes.Add(employeeContentType); list.ContentTypes["Item"].Delete(); list.Update(); } Next, I will cover how to create the lookup fields. The difference between creating a normal field and a lookup field is that you need to create the lookup fields after the lists are created. This is because the lookup fields reference fields from the foreign list. 1. In your solution, you need to create a feature that deploys the list before deploying the lookup fields. 2. You need to write the following code in the feature receiver to add the lookup columns to the ContentType. C# //add the lookup fields SPFieldLookup departmentField = EnsureLookupField(currentWeb, "YBBESTDepartment", currentWeb.Lists["DepartmentList"].ID, "Title"); //add to the content types SPContentType employeeContentType = currentWeb.ContentTypes["Employee"]; //Add the lookup fields as SPFieldLink employeeContentType.FieldLinks.Add(new SPFieldLink(departmentField)); employeeContentType.Update(true); private static SPFieldLookup EnsureLookupField(SPWeb currentWeb, String sFieldName, Guid LookupListID, String sLookupField) { //add the lookup fields SPFieldLookup lookupField = null; try { lookupField = currentWeb.Fields[sFieldName] as SPFieldLookup; } catch (Exception e) { } if (lookupField == null) { currentWeb.Fields.AddLookup(sFieldName, LookupListID, true); currentWeb.Update(); lookupField = currentWeb.Fields[sFieldName] as SPFieldLookup; lookupField.LookupField = sLookupField; lookupField.Group = "YBBEST"; lookupField.Required = true; lookupField.Update(); } return lookupField; }

    Read the article

  • [MISC GEEKERY] Lucid Lynx to Come Loaded with Ubuntu One Music Store

    - by Vivek
    Ubuntu 10.04 (code name Lucid Lynx) will come loaded with the Ubuntu One music store. Rhythmbox will have the Ubuntu One music store integrated in it. It’ll also allow users to download purchased music to their local machine. Ubuntu One Music Store Users will be able to access the Ubuntu One music store from the sidebar of Rhythmbox. The music store is a web page that opens in the Rhythmbox player. There are albums listed on the home page of the Ubuntu One music store page. The Ubuntu One music store is powered by 7digital, which is a leading digital B2B media delivery company based in London and operating globally. Canonical, the company behind Ubuntu, has partnered with 7digital to bring the music store to its users, integrating it with Rhythmbox and its cloud storage service UbuntuOne, which was launched last year. The home screen of the Ubuntu One music store displays popular albums and functionality to browse and search. You can search for Artists, Tracks, Albums, or a combination of all three. Users will also be able to browse the store alphabetically, or based on different music genres. Once you select a specific artist, all their available albums are arranged in a grid. Once an album is selected, you’ll be able to download specific songs or the whole album. You’ll also be allowed to preview different songs for 60 seconds. You’ll be able to buy tracks using a credit card or with PayPal. The purchased tracks will be visible under Library \ Purchased from Ubuntu One. The downloaded tracks are also synced with your UbuntuOne account. This means that you’ll be able to access your tracks from anywhere on the web. The default UbuntuOne account comes with 2 GB of free storage; however, you can also purchase additional space if you need it. All the music is in mp3 format, which is not supported by default in Ubuntu. However, you can get mp3 playback functionality using the GStreamer multimedia framework. Conclusion All in all, the Ubuntu One music store is a positive move to enhance the user experience and also increase the popularity of Canonical in bringing Ubuntu closer to regular users. This would also allow Canonical to make some revenue in collaboration with 7digital. Ubuntu One Music Store Wiki

    Read the article

  • Silverlight Cream for May 02, 2010 -- #854

    - by Dave Campbell
    In this Issue: Michael Washington, Jason Young(-2-, -3-), Phil Middlemiss, Jeremy Likness, Victor Gaudioso, Kunal Chowdhury, Antoni Dol, and Jacek Ciereszko(-2-). Shoutout: Victor Gaudioso has aggregated All of My Silverlight Video Tutorials in One Place (revised again 05.02.10) From SilverlightCream.com: Unit Testing A Silverlight 'Simplified MVVM' Modal Popup Michael Washington's latest 'Simplified MVVM' post is published at The Code Project and is on Unit Testing with MVVM. Input Localization in Silverlight without IValueConverter Jason Young sent me some links to posts I've not seen... this first one is on localization by using the Language property of the Root Visual. MVVM – The Model - Part 1 – INotifyPropertyChanged Jason Young's next archive post is the first of a series on MVVM and Silverlight 4 ... implementing a simple ViewModel base class. Silverlight, WCF, and ASP.Net Configuration Gotchas Jason Young worked at tracking down the answers to some forum questions and in the process has produced a post of 'gotchas' with using WCF in Silverlight. A Chrome and Glass Theme - Part 5 Phil Middlemiss has part 5 of his Chrome and Glass Theme tutorial up ... in this one, he's looking at the Progress Bar and Slider. Download the files and play along. Silverlight Out of Browser (OOB) Versions, Images, and Isolated Storage Jeremy Likness has a post up responding to his 3 major questions about OOB apps, and he has the code up for the sample too. New Silverlight Video Tutorial: How to Make a Slide In/Out Navigation Bar – All in Blend Victor Gaudioso's latest video tutorial is on building a Behavior for a Slide in/out Navigation bar... kinda like the menu sliders on my GlyphMap Utility... only easier! Command Binding in Silverlight 4 (Step-by-Step) Kunal Chowdhury has another post up at DotNetFunda, and this time he's talking about Command Binding in Silverlight 4 with an eye toward MVVM usage. The Silverlight PageCurl implementation Antoni Dol has a post up about doing a Page Curl effect in Silverlight. He has a manual up on the effect and full application code. How to center and scale Silverlight applications using ViewBox control Jacek Ciereszko has a couple posts up about centering and scaling your app with the ViewBox control. This first one is a code solution. Source is available, as is a Polish version. Silverlight Center And Scale Behavior In Jacek Ciereszko's 2nd post, he provides a Behavior that handles the scaling and centering from the previous post. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • Stackify Aims to Put More ‘Dev’ in ‘DevOps’

    - by Matt Watson
    Originally published on VisualStudioMagazine.com on 8/22/2012 by Keith Ward. The Kansas City-based startup wants to make it easier for developers to examine the network stack and find problems in code. The first part of “DevOps” is “Dev”. But according to Matt Watson, Devs aren’t connected enough with Ops, and it’s time that changed. He founded the startup company Stackify earlier this year to do something about it. Stackify gives developers unprecedented access to the IT side of the equation, Watson says, without putting additional burden on the system and network administrators who ultimately ensure the health of the environment. “We need a product designed for developers, with the goal of getting them more involved in operations and app support. Now, there’s next to nothing designed for developers,” Watson says. Stackify allows developers to search the network stack to troubleshoot problems in their software that might otherwise take days of coordination between development and IT teams to solve. Stackify allows developers to search log files, configuration files, databases and other infrastructure to locate errors. A key to this is that the developers are normally granted read-only access, soothing admin fears that developers will upload bad code to their servers. Implementation starts with data collection on the servers. Among the information gleaned is application discovery, server monitoring, file access, and other data collection, according to Stackify’s Web site. Watson confirmed that Stackify works seamlessly with virtualized environments as well. Although the data collection software must be installed on Windows servers, it can monitor both Windows and Linux servers. Once collection’s finished, developers have the kind of information they need, without causing heartburn for the IT staff. Stackify is a 100 percent cloud-based service. The company uses Windows Azure for hosting, a decision Watson’s happy with. With Azure, he says, “It’s nice to have all the dev tools like cache and table storage.” Although there have been a few glitches here and there with the service, it’s run very smoothly for the most part, he adds. Stackify is currently in a closed beta, with a public release scheduled for October. Watson says that pricing is expected to be $25 per month, per server, with volume discounts available. He adds that the target audience is companies with at least five developers. Watson founded Stackify after selling his last company, VinSolutions, to AutoTrader.com for “close to $150 million”, according to press accounts. Watson has since founded the Watson Technology Group, which focuses on angel investing. About the Author: Keith Ward is the editor in chief of Visual Studio Magazine.

    Read the article

  • cocos2dx - Custom Fragment Shader and CCRenderTexture

    - by saiy2k
    I have a CCRenderTexture that is filled with a sprite when the scene is loaded, as follows: canvas = CCRenderTexture::create(this->getContentSize().width, this->getContentSize().height); canvas->setPosition(data->position); canvas->beginWithClear(0.0, 0.0, 0.0, 0); this->visit(); canvas->end(); The above code is written within a class which derives from CCSprite (hence this). Then, in another function applyShader(), I create a sprite named splat from the texture of CCRenderTexture *canvas. Thus splat will contain the whole texture of canvas. Now I apply a custom fragment shader to the splat by calling the function splat->renderShader(), which will modify some small portion of the whole texture. Then I draw the modified texture back to the CCRenderTexture *canvas. Hence, applyShader() will * take a texture from CCRenderTexture, * create a sprite based on it, * apply a fragment shader to it * and draw the modified texture back to CCRenderTexture. This applyShader() will be called repetitively and its code is as follows: splat = Splat::createWithTexture(art->canvas->getSprite()->getTexture()); splat->renderShader(); art->canvas->begin(); splat->visit(); art->canvas->end(); My shader code is (nothing fancy): precision mediump float; varying vec2 v_texCoord; uniform sampler2D u_texture; uniform sampler2D u_colorRampTexture; uniform float params[5]; void main() { gl_FragColor = texture2D(u_texture, v_texCoord); return; } So, with the above code I expect the original sprite (this) to get rendered over and over again without any visual changes. But on each call to applyShader(), the texture is getting stretched a little and the stretched image is getting rendered. After some 10 calls, the image gets badly distorted. Can someone please tell me where I am going wrong? Thanks :-) PS: All code shown here is partial, not complete code. Edit: Adding Screens Update: The problem has nothing to do with shaders, it seems. It happens even when I don't call renderShader(). The actual lines of code are: splat = Splat::createWithTexture(art->canvas->getSprite()->getTexture()); splat->setPosition( ccp( art->getContentSize().width * 0.5, art->getContentSize().height * 0.5 ) ); splat->setFlipY(true); art->canvas->begin(); splat->visit(); art->canvas->end();

    Read the article

  • Silverlight Cream for April 02, 2010 -- #828

    - by Dave Campbell
    In this Issue: Phil Middlemiss, Robert Kozak, Kathleen Dollard, Avi Pilosof, Nokola, Jeff Wilcox, David Anson, Timmy Kokke, Tim Greenfield, and Josh Smith. Shoutout: SmartyP has additional info up on his WP7 Pivot app: Preview of My Current Windows Phone 7 Pivot Work From SilverlightCream.com: A Chrome and Glass Theme - Part I Phil Middlemiss is starting a tutorial series on building a new theme for Silverlight, in this first one we define some gradients and color resources... good stuff Phil Intercepting INotifyPropertyChanged This is Robert Kozak's first post on this blog, but it's a good one about INotifyPropertyChanged and MVVM and has a solution in the post with lots of code and discussion. How do I Display Data of Complex Bound Criteria in Horizontal Lists in Silverlight? Kathleen Dollard's latest article in Visual Studio magazine is in answer to a question about displaying a list of complex bound criteria including data, child data, and photos, and displaying them horizontally one at a time. Very nice-looking result, and all the code. Windows Phone: Frame/Page navigation and transitions using the TransitioningContentControl Avi Pilosof discusses the built-in (boring) navigation on WP7, and then shows using the TransitioningContentControl from the Toolkit to apply transitions to the navigation. EasyPainter: Cloud Turbulence and Particle Buzz Nokola returns with a couple more effects for EasyPainter: Cloud Turbulence and Particle Buzz ... check out the example screenshots, then go grab the code. Property change notifications for multithreaded Silverlight applications Jeff Wilcox is discussing the need for getting change notifications to always happen on the UI thread in multi-threaded apps... great diagrams to see what's going on. Tip: The default value of a DependencyProperty is shared by all instances of the class that registers it David Anson has a tip up about setting the default value of a DependencyProperty, and the consequence that may have depending upon the type. Building a “real” extension for Expression Blend Timmy Kokke's code is WPF, but the subject is near and dear to us all, Timmy has a real-world Expression Blend extension up... a search for controls in the Objects and Timelines pane ... and even if that doesn't interest you... it's the source to a Blend extension! XPath support in Silverlight 4 + XPathPad Tim Greenfield not only talks about XPath in SL4RC, but he has produced a tool, XPathPad, and provided the source... if you've used XPath, you either are a higher thinker than me (not a big stretch), or you need this :) Using a Service Locator to Work with MessageBoxes in an MVVM Application Josh Smith posted about a question that comes up a lot: showing a messagebox from a ViewModel object. This might not work for custom message boxes or unit testing. This post covers the Unit Testing aspect. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    Windows Phone MIX10

    Read the article

  • BUILD 2013 - Microsoft Set to Unveil It&rsquo;s Reinvention

    - by D'Arcy Lussier
    Originally posted on: http://geekswithblogs.net/dlussier/archive/2013/06/24/153211.aspx Some thoughts as we head into BUILD this week… This week in San Francisco Microsoft will be hosting the BUILD conference. They’ll be talking up Windows 8.1 (Windows Blue), more Azure, some Windows Phone, XBox, Office 365… actually, they told us on the original BUILD announcement site what we’d be seeing. While looking at this, consider a recent article from The Verge that talks about the speculation of a huge shake-up at Microsoft. From the article: All Things D quotes one insider as saying they're "titanic" changes, noting they might be attached to Ballmer's legacy at the company. "It’s the first time in a long time that it feels like that there will be some major shifts, including some departures," says the alleged insider. Considering Ballmer let Sinofsky go right after the Windows 8 launch, the idea of Microsoft cutting loose some executives doesn’t seem to be big news. But the next piece of the article frames things more interestingly: Ballmer is reportedly considering a new structure that would create four separate divisions: enterprise business, hardware, applications and services, and an operating systems group. This statement got me thinking…what would this new structure look like? Below is one possibility: At a recent (this year or last year, I can’t recall which) Microsoft shareholders’ meeting, Ballmer made the statement that Microsoft is now a products and services company. At the time I don’t think I really let that statement sink in. Partially because I really liked the Microsoft of my professional youth – the one that was a software and platform company. In Canada, Microsoft has been pushing three platform areas: Lync, Azure, and SQL Server. I would expect those to change moving forward as Microsoft continues to look for Partners that will help them increase their Services revenue through solutions that incorporate/are based on Azure, Office 365, Lync, and Dynamics. I also wonder if we’re not seeing a culling of partners through changes to the Microsoft Partner Program. In addition to the changing certification requirements that align more to Microsoft’s goals (i.e. there is no desktop-development-based MCSD, only Windows 8 Store Apps), competencies that partners can qualify for are being merged, requirements changed, and licenses provided reduced. Ballmer warned as much at the last WPC though that they were looking for partners who were “all in” with Microsoft, and these programs seem to support that sentiment. Heading into BUILD this week, I’ll be looking to answer one question – what does it mean to be a Microsoft developer here in the 2010s? What is the future of the Microsoft development platform? Sure, Visual Studio is still alive and well and Microsoft realizes that there’s a huge install base of .NET developers actively working on solutions. But they’ve ratcheted down the messaging around their development stack and instead focussed on promoting development for their platforms and services. Last year at BUILD with the release of Windows 8, Microsoft just breached the walls of its cocoon. After this BUILD and the organizational change announcements in July, we’ll see what Microsoft looks like fully emerged from its metamorphosis.

    Read the article

  • ASP.NET Web API - Screencast series Part 2: Getting Data

    - by Jon Galloway
    We're continuing a six part series on ASP.NET Web API that accompanies the getting started screencast series. This is an introductory screencast series that walks through everything from File / New Project to some more advanced scenarios like Custom Validation and Authorization. The screencast videos are all short (3-5 minutes) and the sample code for the series is both available for download and browsable online. I did the screencasts, but the samples were written by the ASP.NET Web API team. In Part 1 we looked at what ASP.NET Web API is, why you'd care, did the File / New Project thing, and did some basic HTTP testing using browser F12 developer tools. This second screencast starts to build out the Comments example - a JSON API that's accessed via jQuery. This sample uses a simple in-memory repository. At this early stage, the GET /api/values/ just returns an IEnumerable<Comment>. In part 4 we'll add on paging and filtering, and it gets more interesting.   The get by id (e.g. GET /api/values/5) case is a little more interesting. The method just returns a Comment if the Comment ID is valid, but if it's not found we throw an HttpResponseException with the correct HTTP status code (HTTP 404 Not Found). This is an important thing to get - HTTP defines common response status codes, so there's no need to implement any custom messaging here - we tell the requestor that the resource they requested wasn't there.  public Comment GetComment(int id) { Comment comment; if (!repository.TryGet(id, out comment)) throw new HttpResponseException(HttpStatusCode.NotFound); return comment; } This is great because it's standard, and any client should know how to handle it. There's no need to invent custom messaging here, and we can talk to any client that understands HTTP - not just jQuery, and not just browsers. But it's crazy easy to consume an HTTP API that returns JSON via jQuery. The example uses Knockout to bind the JSON values to HTML elements, but the thing to notice is that calling into this /api/comments is really simple, and the return from the $.get() method is just JSON data, which is really easy to work with in JavaScript (since JSON stands for JavaScript Object Notation and is the native serialization format in Javascript). $(function() { $("#getComments").click(function () { // We're using a Knockout model. This clears out the existing comments. viewModel.comments([]); $.get('/api/comments', function (data) { // Update the Knockout model (and thus the UI) with the comments received back // from the Web API call. viewModel.comments(data); }); }); }); That's it! Easy, huh? In Part 3, we'll start modifying data on the server using POST and DELETE.
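    The post doesn't show the repository class itself, so as a rough idea only, here is what a "simple in-memory repository" with the TryGet method used above might look like (the Comment properties and the Add method are assumptions for illustration, not the sample's actual code):

        using System.Collections.Concurrent;
        using System.Collections.Generic;

        public class Comment
        {
            public int ID { get; set; }
            public string Text { get; set; }
        }

        public class CommentRepository
        {
            // Thread-safe in-memory store keyed by comment ID.
            private readonly ConcurrentDictionary<int, Comment> store = new ConcurrentDictionary<int, Comment>();

            public IEnumerable<Comment> GetAll()
            {
                return store.Values;
            }

            // Mirrors the TryGet call in GetComment above: returns false for an unknown ID,
            // which the controller turns into an HTTP 404 via HttpResponseException.
            public bool TryGet(int id, out Comment comment)
            {
                return store.TryGetValue(id, out comment);
            }

            public void Add(Comment comment)
            {
                store[comment.ID] = comment;
            }
        }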

    Read the article

  • Tulsa Azure Boot Camp

    - by dmccollough
    Windows Azure Boot Camp presented by HyperVize & TulsaTech When: Thursday July 1st and Friday July 2nd Registration: Click here. Where: TulsaTech Riverside Campus 801 East 91st Street Tulsa, Ok 74132-4008 Click here for a map. Summary Tulsa Windows Azure Boot Camp is a comprehensive 2-day training program for members of the development community in Tulsa, Oklahoma. At the conclusion of this program, the attendees should have a deep understanding of Azure, BPOS, and advanced development techniques for both platforms. Who should attend: Web Developers, Backend Developers, SQL DBAs, Consultants, & IT Leaders who are interested in using Azure for development, data storage, or processing. Both days are suggested, but if you can't attend both, contact us for a special one-day pass. Schedule: Day one of the training sessions will be held on July 1st, 2010, between 9 AM and 4:30 PM. Topics covered on day 1: Azure Basics, Web Development, & Data Storage. Day two of the training sessions will be held on July 2nd, 2010, between 9 AM and 4:30 PM. Topics covered on day 2: Architecture, Business Value, SOA Development, SQL Azure, & Advanced Development. Pre-requisites: If you want to stay up to speed on the Windows Azure Labs you will need to install the tools and updates listed on the Windows Azure Boot Camp website: http://windowsazurebootcamp.com/whattobring Boot Camp Agenda Day 1 – July 1st 2010:  · 8:30 – 9:00 - Registration · 9:00 – 10:00 - Module 1: Intro to Azure & Cloud Computing · 10:00 – 11:00 - Module 2: Using Web Roles · 11:00 – Noon - Lab 1 & workstation configuration · Noon – 1:00 - Lunch · 1:00 – 2:00 - Module 3: Blobs · 2:00 – 3:00 - Module 4: Tables · 3:00 – 4:00 - Module 5: Queues · 4:00 – ? - Q&A / Open Discussion Day 2 – July 2nd 2010: · 9:00 – 10:00 - Module 6: Building a business with Azure · 10:00 – 11:00 - Module 7: Cloud Scenarios · 11:00 – Noon - Module 8: SQL Azure · Noon – 1:00 - Lunch · 1:00 – 2:00 - Module 9: Basic Worker Roles · 2:00 – 3:00 - Module 10: Advanced Worker Roles · 3:00 – 4:00 - Module 11: Azure Diagnostics · 4:00 – ??? - Module 12: App Fabric

    Read the article

  • SQL SERVER – Index Created on View not Used Often – Observation of the View

    - by pinaldave
    I always enjoy writing about concepts on Views. Views are frequently used, and so it’s not surprising that I have seen so many misconceptions about this subject. To clear such misconceptions, I have previously written the article SQL SERVER – The Limitations of the Views – Eleven and more…. I also wrote a follow-up article wherein I demonstrated that, without even creating an index on the base table, the query on the View will not use the Index created on the View. You can read about this demonstration over here: SQL SERVER – Index Created on View not Used Often – Limitation of the View 12. I promised in that post that I would also write an article where I would demonstrate the condition where the Index will be used. I got many responses suggesting that I can do that by using NOEXPAND; I agree. I have already written about this in my original summary article. Here is a way for you to see how an Index created on a View can be utilized. We will do the following steps in this exercise: Create a Table Create a View Create Index On View Write SELECT with ORDER BY on View USE tempdb GO IF EXISTS (SELECT * FROM sys.views WHERE OBJECT_ID = OBJECT_ID(N'[dbo].[SampleView]')) DROP VIEW [dbo].[SampleView] GO IF EXISTS (SELECT * FROM sys.objects WHERE OBJECT_ID = OBJECT_ID(N'[dbo].[mySampleTable]') AND TYPE IN (N'U')) DROP TABLE [dbo].[mySampleTable] GO -- Create SampleTable CREATE TABLE mySampleTable (ID1 INT, ID2 INT, SomeData VARCHAR(100)) INSERT INTO mySampleTable (ID1,ID2,SomeData) SELECT TOP 100000 ROW_NUMBER() OVER (ORDER BY o1.name), ROW_NUMBER() OVER (ORDER BY o2.name), o2.name FROM sys.all_objects o1 CROSS JOIN sys.all_objects o2 GO -- Create View CREATE VIEW SampleView WITH SCHEMABINDING AS SELECT ID1,ID2,SomeData FROM dbo.mySampleTable GO -- Create Index on View CREATE UNIQUE CLUSTERED INDEX [IX_ViewSample] ON [dbo].[SampleView] ( ID2 ASC ) GO -- Select from view SELECT ID1,ID2,SomeData FROM SampleView ORDER BY ID2 GO When we check the execution plan for this, we can clearly see that the Index created on the View is utilized. The ORDER BY clause uses the Index created on the View. I hope this makes it clearer how the Index on the View is used. Again, I strongly recommend reading my earlier series about the limitations of the Views found here: SQL SERVER – The Limitations of the Views – Eleven and more…. Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Optimization, SQL Performance, SQL Query, SQL Scripts, SQL Server, SQL Tips and Tricks, SQL View, T SQL, Technology

    Read the article

  • Silverlight Cream for March 05, 2010 -- #807

    - by Dave Campbell
    In this Issue: Phil Middlemiss(-2-, -3-), Pencho Popadiyn, John Papa(-2-, -3-), Jim Lynn, and SilverLaw(-2-). Shoutouts: Walt Ritscher has added more shaders and features: Shazzam 1.2 – Feature Overview I hope you're getting as excited as I am about MIX10. You should be reading MIX10 News and checking out the sessions and the directory of attendees. From SilverlightCream.com: Watermarked TextBox Part I Phil Middlemiss's Orb Radio Button hit number two in the Silverlight Cream Skim page, in 2 days... now Phil has a very nice 3-part tutorial up on creating a Watermarked TextBox with lots of cool features. This is part 1 and starts the series off. Watermarked TextBox Part II In Phil Middlemiss's Part II of the Watermarked TextBox tutorial, he's concentrating on visual elements of the control begun in the last episode... you're paying attention, right? ... this is a cool control :) Watermarked Textbox Part III In the final part of Phil Middlemiss's tutorial series, he's wiring all the pieces together in the UserControl. Go grab the control, then leave Phil some love on his blog! Using Reactive Extensions in Silverlight Pencho Popadiyn has a great tutorial up on SilverlightShow about Rx ... if you want to get your arms around this... this tutorial is a good place to begin. Silverlight TV 10: Silverlight Hyper Video Platform with Jesse Liberty Running a little behind here, but check out John Papa and THE Silverlight GeekTM Jesse Liberty discussing Jesse's Hyper Video Platform on Silverlight TV Silverlight TV 11: Dynamically Loading XAPs with MEF In Silverlight TV episode 11, John Papa talks to Glenn Block about MEF and partitioning and dynamically loading XAPs ... good stuff. Silverlight TV 12: The Best Blend 3 Video Ever! And the latest Silverlight TV episode, number 12, has John Papa and Adam Kinney giving "The Best Blend 3 Video ever (or at least on Silverlight TV)"... check out the list of topics and you'll want to watch :) InvalidOperation_EnumFailedVersion when binding data to a Silverlight Chart Read Jim Lynn's post about a problem found while deploying his app, the very confusing (long) error, and the workaround. Leather Stamped Style Series For Silverlight Controls - Part 1 SilverLaw continued after his 'leather stamped' textbox and has added TextBlock, Button and some template bindings... check it out then get it at the Expression Gallery Circular Accordion Style Silverlight 3 SilverLaw also built a Circular Accordion style... interesting idea and once again it's in the Expression Gallery. He's also looking for feedback. Stay in the 'Light! Twitter SilverlightNews | Twitter WynApse | WynApse.com | Tagged Posts | SilverlightCream Join me @ SilverlightCream | Phoenix Silverlight User Group Technorati Tags: Silverlight    Silverlight 3    Silverlight 4    MIX10

    Read the article

  • Translate report data export from RUEI into HTML for import into OpenOffice Calc Spreadsheets

    - by [email protected]
    A common question from users is how to import the data from the automated data export of Real User Experience Insight (RUEI) into tools for archiving, dashboarding or combination with other sets of data. XML is well-suited for such a translation via the companion Extensible Stylesheet Language Transformations (XSLT). Basically XSLT utilizes XSL, a template on what to read from your input XML data file and where to place it into the target document. The target document can be anything you like, i.e. XHTML, CSV, or even an OpenOffice Spreadsheet, etc., as long as it is a plain text format. XML 2 OpenOffice.org Spreadsheet For the XSLT to work as an OpenOffice.org Calc Import Filter: How to add an XML Import Filter to OpenOffice Calc Start OpenOffice.org Calc and select Tools > XML Filter Settings New... Fill in the details as follows: Filter name: RUEI Import filter Application: OpenOffice.org Calc (.ods) Name of file type: Oracle Real User Experience Insight File extension: xml Switch to the transformation tab and enter/select the following, leaving the rest untouched: XSLT for import: ruei_report_data_import_filter.xsl Please see at the end of this blog post for a download of the referenced file. Select RUEI Import filter from the list and Test XSLT Click on Browse to select Transform file: export.php.xml OpenOffice.org Calc will transform and load the XML file you retrieved from RUEI in a human-readable format. You can now select File > Open... and change the filetype to open your RUEI exports directly in OpenOffice.org Calc, just like any other native Spreadsheet format. Files of type: Oracle Real User Experience Insight (*.xml) File name: export.php.xml XML 2 XHTML Most XML-powered browsers provide inherent XSL Transformation capabilities; you only have to reference the XSLT Stylesheet in the head of your XML file. Then open the file in your favourite Web Browser, Firefox, Opera, Safari or Internet Explorer alike. <?xml version="1.0" encoding="ISO-8859-1"?><!-- inserted line below --> <?xml-stylesheet type="text/xsl" href="ruei_report_data_export_2_xhtml.xsl"?><!-- inserted line above --><report> You can find a patched example export from RUEI plus the above referenced XSL-Stylesheets here: export.php.xml - Example report data export from RUEI ruei_report_data_export_2_xhtml.xsl - RUEI to XHTML XSL Transformation Stylesheet ruei_report_data_import_filter.xsl - OpenOffice.org XML import filter for RUEI report export data If you would like to do things like this on the command line you can use either Xalan or xsltproc. The basic command syntax for xsltproc is very simple: xsltproc -o output.file stylesheet.xslt inputfile.xml You can use this with the above two stylesheets to translate RUEI Data Exports into XHTML and/or OpenOffice.org Calc ODS-Format. Or you could write your own XSLT to transform into Comma-separated Value lists. Please let me know what you think or do with this information in the comments below. Kind regards, Stefan Thieme References used: OpenOffice XML Filter - Create XSLT filters for import and export - http://user.services.openoffice.org/en/forum/viewtopic.php?f=45&t=3490 SUN OpenOffice.org XML File Format 1.0 - http://xml.openoffice.org/xml_specification.pdf

    Read the article

  • Qml and QfileSystemModel interaction problem

    - by user136432
    I'm having a problem realizing an interaction between QML and C++ to obtain a very basic file browser that is shown within a ListView. I tried to use the Qt class QFileSystemModel as the model for my data, but it didn't work as I expected; probably I didn't fully understand the Qt class documentation about the use of this model, or the example I found on the internet. Here is the code that I am using: File main.cpp #include <QModelIndex> #include <QFileSystemModel> #include <QQmlContext> #include <QApplication> #include "qtquick2applicationviewer.h" int main(int argc, char *argv[]) { QApplication app(argc, argv); QFileSystemModel* model = new QFileSystemModel; model->setRootPath("C:/"); model->setFilter(QDir::Files | QDir::AllDirs); QtQuick2ApplicationViewer viewer; // Make QFileSystemModel* available for QML use. viewer.rootContext()->setContextProperty("myFileModel", model); viewer.setMainQmlFile(QStringLiteral("qml/ProvaQML/main.qml")); viewer.showExpanded(); return app.exec(); } File main.qml Rectangle { id: main width: 800 height: 600 ListView { id: view property string root_path: "C:/Users" x: 40 y: 20 width: parent.width - (2*x) height: parent.height - (2*y) VisualDataModel { id: myVisualModel model: myFileModel // Get the model QFileSystemModel exposed from C++ delegate { Rectangle { width: 210; height: 20; radius: 5; border.width: 2; border.color: "orange"; color: "yellow"; Text { text: fileName; x: parent.x + 10; } MouseArea { anchors.fill: parent onDoubleClicked: { myVisualModel.rootIndex = myVisualModel.modelIndex(index) } } } } } highlight: Rectangle { color: "lightsteelblue"; radius: 5 } focus: true } } The first problem with this code is that the first elements I can see within my list are my PC's logical drives, even though I set a specific path. Then when I first double click on drive "C:\" it shows the list of files and directories on that path, but when I double click on a directory a second time the screen flickers for one moment and then it shows the PC logical drives again. Can anyone tell me how I should use the QFileSystemModel class with a ListView QML object? Thanks in advance! Carlo

    Read the article

  • Android OpenGL ES 2 framebuffer not working properly

    - by user16547
    I'm trying to understand how framebuffers work. In order to achieve that, I want to draw a very basic triangle to a framebuffer texture and then draw the resulting texture to a quad on the default framebuffer. However, I only get a fraction of the triangle like below. LE: The triangle's coordinates should be (1) -0.5f, -0.5f, 0 (2) 0.5f, -0.5f, 0 (3) 0, 0.5f, 0 Here's the code to render: @Override public void onDrawFrame(GL10 gl) { renderNormalStuff(); renderFramebufferTexture(); } protected void renderNormalStuff() { GLES20.glViewport(0, 0, texWidth, texHeight); GLUtils.updateProjectionMatrix(mProjectionMatrix, texWidth, texHeight); GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, fbo[0]); GLES20.glUseProgram(mProgram); GLES20.glClearColor(.5f, .5f, .5f, 1); GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_COLOR_BUFFER_BIT); Matrix.setIdentityM(mModelMatrix, 0); Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0); Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0); GLES20.glUniformMatrix4fv(u_MVPMatrix, 1, false, mMVPMatrix, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[0]); GLES20.glVertexAttribPointer(a_Position, 3, GLES20.GL_FLOAT, false, 12, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, vbo[1]); GLES20.glVertexAttribPointer(a_Color, 4, GLES20.GL_FLOAT, false, 16, 0); GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, ibo[0]); GLES20.glDrawElements(GLES20.GL_TRIANGLES, indexBuffer.capacity(), GLES20.GL_UNSIGNED_BYTE, 0); GLES20.glBindBuffer(GLES20.GL_ARRAY_BUFFER, 0); GLES20.glBindBuffer(GLES20.GL_ELEMENT_ARRAY_BUFFER, 0); GLES20.glUseProgram(0); } private void renderFramebufferTexture() { GLES20.glBindFramebuffer(GLES20.GL_FRAMEBUFFER, 0); GLES20.glUseProgram(fboProgram); GLES20.glClearColor(.0f, .5f, .25f, 1); GLES20.glClear(GLES20.GL_COLOR_BUFFER_BIT | GLES20.GL_DEPTH_BUFFER_BIT); GLES20.glViewport(0, 0, width, height); GLUtils.updateProjectionMatrix(mProjectionMatrix, width, height); Matrix.setIdentityM(mModelMatrix, 0); Matrix.multiplyMM(mMVPMatrix, 0, mViewMatrix, 0, mModelMatrix, 0); Matrix.multiplyMM(mMVPMatrix, 0, mProjectionMatrix, 0, mMVPMatrix, 0); GLES20.glUniformMatrix4fv(fbo_u_MVPMatrix, 1, false, mMVPMatrix, 0); //draw the texture GLES20.glActiveTexture(GLES20.GL_TEXTURE0); GLES20.glBindTexture(GLES20.GL_TEXTURE_2D, texture[0]); GLES20.glUniform1i(fbo_u_Texture, 0); GLUtils.sendBufferData(fbo_a_Position, 3, quadPositionBuffer); GLUtils.sendBufferData(fbo_a_TexCoordinate, 2, quadTexCoordinate); GLES20.glDrawElements(GLES20.GL_TRIANGLES, quadIndexBuffer.capacity(), GLES20.GL_UNSIGNED_BYTE, quadIndexBuffer); GLES20.glUseProgram(0); }

    Read the article

  • Upcoming UPK Events

    - by kathryn.lustenberger(at)oracle.com
    February 15th: UPK: Follow Panduit's Lead and Leverage Oracle's User Productivity Kit To Achieve Your Goals - Join us for a live webcast to learn how Oracle's User Productivity Kit can help you meet and exceed your goals. The webcast will feature Jim Boss, from the Panduit Corporation, who will share how Oracle's User Productivity Kit was used with both Oracle and non-Oracle applications to help Panduit meet its goals. Date: February 15th, 2011 at 12:00 PST / 3:00 EST Evite: http://www.oracle.com/us/dm/65630-naod10046029mpp005c010-se-300908.html

    March 2nd: Synaptis teams with Oracle to deliver a UPK customer success story - Webinar Offering The Value of UPK (Customer Success Story): How to leverage the value of UPK to streamline processes and maximize end user adoption for a global implementation. Join us to learn how the power of UPK can be leveraged to train end users globally in a successful and cost-effective manner. A valued Oracle UPK customer will share experiences, successes, challenges, and strategies. The webinar will also include a question and answer session to give attendees an opportunity to interact directly with the Oracle UPK customer, Synaptis, and the Oracle UPK Team. Date: March 2, 2011 Time: 11:00am - 12:00pm EST Register for this webinar

    March 27th - 30th: The Alliance 2011 conference is an annual event for all higher education, government, and public sector users of Oracle applications. The Alliance conference is organized and managed by the Higher Education User Group (www.heug.org). This is the 14th annual event for the HEUG. This is your opportunity to join over 3,200 other higher education, federal, state, and local government users to network, learn, and share in our amazing combined experiences. The Alliance conference team is hard at work putting together the best conference ever for 2011 - so don't delay, make your plans now to be part of Alliance 2011! When: Sunday, March 27th, 2011 - Wednesday, March 30th, 2011 Where: The Colorado Convention Center (Denver, Colorado) Registration for Alliance 2011 is Now Open! UPK will be represented at this event, offering: Pre-Conference Training (Learn the Basics of Oracle User Productivity Kit (UPK); Taking Your UPKs to a Whole New Level, Advanced Use of UPK); Demo Pod Staff; Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle; Beyond Basic UPK -- User Tracking and SmartHelp; Leveraging Oracle and User Productivity Kit (UPK) to Develop a Comprehensive Training Program; Oracle User Productivity Kit Strategy and Roadmap -- Key to User Adoption

    April 10th - 14th: Registration for COLLABORATE 11 has begun - Don't miss the most comprehensive, user-driven conference devoted to Oracle applications and technology. Collaborate with a global network of more than 5,000 peers and experts to share real-world experiences, solve your challenges, and gain insights to validate your technology plans. Read below to discover which group to register with for the best value. UPK will be represented at this event, offering: Demo Pod Staff; Sessions: Oracle User Productivity Kit: Creating Value throughout the Project Lifecycle; Centralize all Project Team assets, AND, Deploy Fully Measurable Training with UPK Pro; Oracle User Productivity Kit Strategy and Roadmap - Key to User Adoption. Registration is Now Open!

    Read the article

  • Best of "The Moth" 2010

    - by Daniel Moth
    It is time again (as in 2004, 2005, 2006, 2007, 2008, 2009) to look back at my blog for the past year and identify the areas of interest that were more prominent than others. After doing so, representative posts follow in my top 5 list (in random order).

    1. This was the year where, for the first time since 2004, I had to move my blog engine (blogger.com –> dasBlog), host provider (zen –> godaddy), web server technology and OS (apache on Linux –> IIS on Windows Server). My goal was not to break any permalinks or the look and feel of this website. A series of posts covered how I achieved that goal, culminating in a tool for others to use if they wanted to do the same: Tool to convert blogger.com content to dasBlog. Going forward I aim to be sharing more small code utilities like that one…

    2. At work I am known for being fairly responsive on email, and more importantly for never dropping email balls on the floor. This is due to my email processing system, which I shared here: Processing Email in Outlook. I will be sharing more tips with regards to making the best of the Office products.

    3. There is no doubt in my mind that this is the year people will remember as the one where Microsoft finally fights back in the mobile space. Even though the new platform means my Windows Mobile book sales will dwindle :-), I am ecstatic about Windows Phone 7, both as a consumer and as a developer. On the release day, to get you started, I shared the top 10 Windows Phone 7 developer resources. I will be sharing my tips from my experience in writing code for and consuming this new platform…

    4. For my HPC developer friends using Visual Studio, I shared Slides and code for MPI Cluster Debugger and also gave you all the links you need for getting started with Dryad and DryadLINQ from MSR. Expect more from me on cluster development in the coming year…

    5. Still in the HPC space, but actually also in game and even mainstream development, the big disruption and opportunity comes in the form of GPGPU and, on the Microsoft platform, (currently) DirectCompute. Expect more from me on gpgpu development in the coming year…

    Subscribe via the link on the left to stay tuned for 2011… I wish you a very Happy New Year (with whatever definition of happiness works for you)! Comments about this post welcome at the original blog.

    Read the article

  • SQL SERVER – Challenge – Puzzle – Usage of FAST Hint

    - by pinaldave
    I was recently working with various SQL Server hints. After working with them for a day, I realized that for one hint I was not able to come up with a good example. The hint is FAST. Let us look at the definition of the FAST hint from Books Online:

    FAST number_rows - Specifies that the query is optimized for fast retrieval of the first number_rows. This is a nonnegative integer. After the first number_rows are returned, the query continues execution and produces its full result set.

    Now the question is: in what conditions can this hint be useful? I have tried many different combinations and found that this hint does not make much performance difference; in fact, I did not notice any change in the time taken to load the result set. I noticed that this hint does not change the number of pages read to return the result. A difference in performance is not necessarily expected, because what the FAST hint does is return the first few results FAST - which does not mean there will be a difference in overall performance. I also understand that this hint gives guidance to the query optimizer that only 100 rows are expected in the result set. This tricks the optimizer into thinking there are only 100 rows, which may lead it to render a different execution plan than the one it would have taken in the normal case (without the hint). Again, this will not necessarily happen every time.

    Now, if you read the above discussion, you will find that my basic understanding of the hint is clear, but I still feel that I am missing something. Here are my questions:

    1) In what conditions can this hint be useful? What is the case when someone wants to see the first few rows early? My experience suggests that when the first few rows are rendered, the remaining rows are rendered as well.

    2) Is there any way an application can retrieve the fast-fetched rows from SQL Server?

    3) Do you use this hint in your application? Why? When? And how?

    Here are a few examples I attempted during my experiment. I found there is no difference in the execution plan except that the estimated number of rows is different, leading the optimizer to think the cost is lower, though in reality that is not the case.

        USE AdventureWorks
        GO
        SET STATISTICS IO ON
        SET STATISTICS TIME ON
        GO
        ---------------------------------------------
        -- Table Scan with Fast Hint
        SELECT * FROM Sales.SalesOrderDetail
        GO
        SELECT * FROM Sales.SalesOrderDetail
        OPTION (FAST 100)
        GO
        ---------------------------------------------
        -- Table Scan with Where on Index Key
        SELECT * FROM Sales.SalesOrderDetail
        WHERE OrderQty = 14
        GO
        SELECT * FROM Sales.SalesOrderDetail
        WHERE OrderQty = 14
        OPTION (FAST 100)
        GO
        ---------------------------------------------
        -- Table Scan with Where on Index Key
        SELECT * FROM Sales.SalesOrderDetail
        WHERE SalesOrderDetailID < 1000
        GO
        SELECT * FROM Sales.SalesOrderDetail
        WHERE SalesOrderDetailID < 1000
        OPTION (FAST 100)
        GO

    Reference: Pinal Dave (http://blog.sqlauthority.com) Filed under: Pinal Dave, SQL, SQL Authority, SQL Puzzle, SQL Query, SQL Server, SQL Tips and Tricks, T SQL, Technology
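    To make question 1 concrete, the textbook scenario for FAST is a query with an ORDER BY that an existing index can satisfy: with FAST n the optimizer may favour a plan that streams rows from the index (plus lookups) so the first rows appear quickly, instead of sorting the whole result before returning anything. The sketch below is a hedged illustration against AdventureWorks, not a guaranteed plan change - the actual choice depends on the optimizer and statistics:

        -- Without the hint the optimizer may scan and sort the whole table;
        -- with FAST 10 it may prefer the nonclustered index on ProductID
        -- and start streaming already-ordered rows sooner.
        SELECT SalesOrderID, ProductID, OrderQty
        FROM Sales.SalesOrderDetail
        ORDER BY ProductID;

        SELECT SalesOrderID, ProductID, OrderQty
        FROM Sales.SalesOrderDetail
        ORDER BY ProductID
        OPTION (FAST 10);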

    Read the article

  • What’s new in Silverlight 4 RC?

    - by pluginbaby
    I am here in Las Vegas for MIX10, where Scott Guthrie announced today the release of Silverlight 4 RC and the Visual Studio 2010 tools. You can now install VS2010 RC!!! As always, download links are here: www.silverlight.net He also said that the final version of Silverlight 4 will come next month (so April)! Four months ago, I wrote a blog post on the new features of the Silverlight 4 beta, so… what's new in the RC?

    Rich Text: · RichTextArea renamed to RichTextBox · Text position and selection APIs · “Xaml” property for serializing text content · XAML clipboard format · FlowDirection support on Runs tag · “Format then type” support when dragging controls to the designer · Thai/Vietnamese/Indic support · UI Automation Text pattern

    Networking: · UploadProgress support (Client stack) · Caching support (Client stack) · Sockets security restrictions removal (Elevated Trust) · Sockets policy file retrieval via HTTP · Accept-Language header

    Out of Browser (Elevated Trust): · XAP signing · Silent install and emulation mode · Custom window chrome · Better support for COM Automation · Cancellable shutdown event · Updated security dialogs

    Media: · Pinned full-screen mode on secondary display · Webcam/Mic configuration preview · More descriptive MediaSourceStream errors · Content & Output protection updates · Updates to H.264 content protection (ClearNAL) · Digital Constraint Token · CGMS-A · Multicast · Graphics card driver validation & revocation

    Graphics and Printing: · HW accelerated Perspective Transforms · Ability to query page size and printable area · Memory usage and perf improvements

    Data: · Entity-level validation support of INotifyDataErrorInfo for DataGrid · XPath support for XML

    Parser: · New architecture enables future innovation · Performance and stability improvements · XmlnsPrefix & XmlnsDefinition attributes · Support setting order-dependent properties

    Globalization & Localization: · Support for 31 new languages · Arabic, Hebrew and Thai input on Mac · Indic support

    More: · Update to DeepZoom code base with HW acceleration · Support for Private mode browsing · Google Chrome support (Windows) · FrameworkElement.Unloaded event · HTML Hosting accessibility · IsoStore perf improvements · Native hosting perf improvements (e.g., Bing Toolbar) · Consistency with Silverlight for Mobile APIs and Tooling · SDK (System.Numerics.dll, Dynamic XAP support (MEF), Frame/Navigation refresh support)

    That’s a lot! You will find more details at the following links: http://timheuer.com/blog/archive/2010/03/15/whats-new-in-silverlight-4-rc-mix10.aspx http://www.davidpoll.com/2010/03/15/new-in-the-silverlight-4-rc-xaml-features/ Technorati Tags: Silverlight

    Read the article

  • AspNetCompatibility in WCF Services &ndash; easy to trip up

    - by Rick Strahl
    This isn’t the first time I’ve hit this particular wall: I’m creating a WCF REST service for AJAX callbacks and using the WebScriptServiceHostFactory host factory in the service:

        <%@ ServiceHost Language="C#" Service="WcfAjax.BasicWcfService"
            CodeBehind="BasicWcfService.cs"
            Factory="System.ServiceModel.Activation.WebScriptServiceHostFactory" %>

    to avoid all configuration. Because of the Factory that creates the ASP.NET AJAX compatible format via the custom factory implementation, I can then remove all of the configuration settings that typically get dumped into the web.config file. However, I do want ASP.NET compatibility, so I still leave in:

        <system.serviceModel>
            <serviceHostingEnvironment aspNetCompatibilityEnabled="true"/>
        </system.serviceModel>

    in the web.config file. This option gives you access to the HttpContext.Current object, effectively giving you access to most of the standard ASP.NET request and response features. This is not recommended as a primary practice, but it can be useful in some scenarios and in backwards compatibility scenarios with ASP.NET AJAX Web Services.

    Now, here’s where things get funky. Assuming you have the setting in web.config, if you now declare a service like this:

        [ServiceContract(Namespace = "DevConnections")]
        #if DEBUG
        [ServiceBehavior(IncludeExceptionDetailInFaults = true)]
        #endif
        public class BasicWcfService

    (or by using an interface that defines the service contract) you’ll find that the service will not work when an AJAX call is made against it. You’ll get a 500 error and a System.ServiceModel.ServiceActivationException system error. Worse, even with IncludeExceptionDetailInFaults enabled, you get absolutely no indication from WCF what the problem is. So what’s the problem? The issue is that once you specify aspNetCompatibilityEnabled="true" in the configuration, you *have to* specify the AspNetCompatibilityRequirements attribute with one of the modes that enables or at least allows it. You need either Required or Allowed:

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Required)]

    Without it the service will simply fail without further warning. It will also fail if you set the attribute value to NotAllowed. The following also causes the service to fail as above:

        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.NotAllowed)]

    This is not totally unreasonable, but it’s a difficult issue to debug, especially since the configuration setting is global - if you have more than one service and one requires traditional ASP.NET access and one doesn’t, then both must have the attribute specified. This is one reason why you’d want to avoid using this functionality unless absolutely necessary. WCF REST provides some basic access to some of the HTTP features, after all, although what’s there is severely limited. I also wish that ServiceActivation errors would provide more error information. Getting an Activation error without further info on what actually is wrong is pretty worthless, especially when it is a technicality like a mismatched configuration/attribute setting like this.

    © Rick Strahl, West Wind Technologies, 2005-2010. Posted in ASP.NET, WCF, AJAX
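    To pull the pieces together, here is a hedged sketch of what a matching, compatibility-enabled service might look like (the operation GetUserAgent and its body are illustrative, not from the original post). The attribute lives in the System.ServiceModel.Activation namespace, and with it in place the aspNetCompatibilityEnabled setting and the service agree, so HttpContext.Current is available inside operations:

        using System.ServiceModel;
        using System.ServiceModel.Activation;
        using System.Web;

        [ServiceContract(Namespace = "DevConnections")]
        [AspNetCompatibilityRequirements(RequirementsMode = AspNetCompatibilityRequirementsMode.Allowed)]
        public class BasicWcfService
        {
            [OperationContract]
            public string GetUserAgent()
            {
                // Legal only because aspNetCompatibilityEnabled="true" in web.config
                // and the AspNetCompatibilityRequirements attribute above allow it.
                return HttpContext.Current.Request.UserAgent;
            }
        }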

    Read the article
