Search Results

Search found 1295 results on 52 pages for 'frank farm'.


  • How to rdc to a particular machine that is a member of a TS Farm?

    - by Amit Arora
    I created a Terminal Services farm comprising 3 TS hosts (say, TS1, TS2 and TS3) running Windows 2008 R2 Enterprise, a TS Connection Broker and a TS Gateway, for the purpose of hosting a Windows application as a TS RemoteApp. The setup works just fine. Now I want to make some further configuration changes on one particular TS host, say TS2, and not on any other TS host. I try to rdc to TS2, but I find myself getting connected to a randomly chosen TS host (sometimes TS1, sometimes TS2, and at other times TS3). I think the RDC connection is also going via the Connection Broker, which forwards me to whichever TS host it decides is best. Is there a way I can deterministically connect to a particular TS host using rdc? I don't have the option to log in locally on a TS host, as the entire setup is hosted in a remote data center. I think this is a very common scenario and must have a straightforward solution. It could be as easy as doing rdc to the Connection Broker server and disabling it for a while, but I don't know how to do that either. Any help will be highly appreciated.
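    A direction worth testing (an editorial assumption, not from the thread): the /admin switch of the Remote Desktop client connects to the administrative session and is normally not redirected by the Connection Broker, so something like this should land on TS2 itself:

        mstsc /v:TS2 /admin

    Alternatively, a host can be taken out of the broker's rotation without disabling the broker by draining it (run "change logon /drain" on TS2), which blocks new user sessions while leaving existing ones alive during reconfiguration.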

    Read the article

  • Creating a Jenkins build farm in a hands-off manner?

    - by user183394
    My colleague and I have set up and run Jenkins on a KVM guest running Ubuntu 12.04, with good results for a while now. We are thinking about deploying a cluster of Jenkins CI hosts in a master/slave configuration, with the libvirt slave plugin to keep our hardware count low. Our environment is strictly Linux (CentOS, Scientific Linux, Fedora, and Ubuntu). Both of us are competent in setting up large clusters. We typically use tools like Cobbler plus a configuration management tool (Puppet, Chef, and the like) to set up a large number of machines (physical and/or virtual) hands-off (hundreds of nodes in less than an hour is typical). We would like to do the same for nodes running Jenkins, but the step-by-step guide doesn't give us any clues in this regard. I did see a multi-slave config plugin, but, being used to dealing with hundreds or more machines completely hands-off, clicking through the UI for many machines just doesn't feel right. Can someone point us to a reference that talks about how to set up a large cluster of Jenkins CI hosts in a more hands-off way?
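    One commonly cited route (hedged; jar names and flags vary by plugin version) is the Jenkins swarm plugin, whose client self-registers with the master. Provisioning a slave from Puppet/Chef then reduces to installing a JRE and a startup command such as:

        java -jar swarm-client.jar -master http://jenkins.example.com:8080 -name $(hostname) -executors 4

    Here jenkins.example.com is a placeholder. With this approach, no per-node clicking in the master's UI is needed; nodes appear as they boot.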

    Read the article

  • One vs. many domain user accounts in a server farm

    - by mjustin
    We are migrating a group of related computers (intranet servers, SQL, application servers of one application) to a new domain. In the past we used one domain user account for every computer (web1, web2, appserver1, appserver2, sql1, sqlbackup ...) to access central Windows resources like network shares. Every computer also has a local user account with the same name. I am not sure if this is necessary, or whether a single domain user account would be easier to configure and maintain. Are there key advantages/disadvantages to having one single user account vs. dedicated accounts per computer for this group of background servers? If I am not wrong, one advantage besides easier administration of the user account could be that moving installed applications and services around between the computers would no longer require a check of the access rights (except where IP addresses or ports are used).

    Read the article

  • HOWTO: Migrate SharePoint site from one farm to another?

    - by Ramiz Uddin
    Hi everyone, I have a site deployed on a development machine. The site was developed under WSS 3.0 and contains custom lists, features, templates, styles, etc. What I have to do is create a deployment package (setup) that I can hand to my client. I know about stsadm, but I don't have access to the production machine. Is there a way I can package all the dependencies in a single file (installation file) and run it on the server, including all the dependencies (and site content)? I've tried to experiment with the SharePoint Content Deployment Wizard. It all went well when exporting the site, but the import always fails with the following message:

        [2/2/2010 3:43:25 PM]: Start Time: 2/2/2010 3:43:25 PM.
        [2/2/2010 3:43:25 PM]: Progress: Initializing Import.
        [2/2/2010 3:43:42 PM]: FatalError: Could not find WebTemplate #75805 with LCID 1033.
           at Microsoft.SharePoint.Deployment.ImportRequirementsManager.VerifyWebTemplate(SPRequirementObject reqObj)
           at Microsoft.SharePoint.Deployment.ImportRequirementsManager.Validate(SPRequirementObject reqObj)
           at Microsoft.SharePoint.Deployment.ImportRequirementsManager.DeserializeAndValidate()
           at Microsoft.SharePoint.Deployment.SPImport.VerifyRequirements()
           at Microsoft.SharePoint.Deployment.SPImport.Run()
        [2/2/2010 3:43:48 PM]: Progress: Import Completed.
        [2/2/2010 3:43:48 PM]: Finish Time: 2/2/2010 3:43:48 PM.
        [2/2/2010 3:43:48 PM]: Completed with 0 warnings.
        [2/2/2010 3:43:48 PM]: Completed with 1 errors.

    (Two further attempts, at 3:44 PM and 3:56 PM, failed with exactly the same FatalError.) I actually couldn't find a good reference on how to use it, but this tool isn't quite what I'm looking for anyway: something that can create a simple deployment package and, after that, needs nothing else done. I might be wrong, but after two days of googling I think there is no such freeware utility that can create a simple package of a site and install it on another farm without any configuration before you run the installation package.
    You might have advice that helps me think outside the box and get to a solution quickly, instead of adding more days of work on the problem. Please share only freeware; I can't afford to buy anything. I'm waiting to be surprised with a good share :) Have a good day! Thanks.
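    For reference, the stsadm route mentioned above looks like this (URLs and file names are placeholders); note that the import fails with exactly the WebTemplate error shown unless the custom site templates and features are installed on the target farm before importing, which is what "Could not find WebTemplate #75805" means:

        stsadm -o export -url http://source/sites/mysite -filename mysite.cmp -includeusersecurity -versions 4
        stsadm -o import -url http://target/sites/mysite -filename mysite.cmp -includeusersecurity

    In other words, there is no single self-contained installer: a WSP solution package for the customizations plus a content export/import is the usual freeware-only combination.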

    Read the article

  • Not able to run discovery in XenApp 6.5

    - by BDAA
    I am installing a fresh Citrix farm. I installed XenApp 6.5 and, while configuring it, added the domain\ctxadmin user as the farm administrator. Now when I log into the Administrator account on the XenApp server and run discovery, it says: "This user account is not an administrator of this farm, or there was a problem contacting the data store. Check that the data store server for the Citrix XenApp farm is online, and verify that your account is configured and enabled as an administrator on the farm." Then I tried to RDP into the XenApp server as the ctxadmin user, and now I get the error "The desktop you are trying to open is currently unavailable. Contact your system administrator to confirm the correct settings are in place for your client connection." I believe that starting from XenApp version 6.x, once XenApp is installed, a Citrix policy needs to be changed as described in http://support.citrix.com/article/CTX124745. But to change the Citrix policy I need to log into AppCenter, which I am not able to do because I am not able to run discovery, as described above. So I am caught in an endless loop. Any help would be greatly appreciated.
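    Two things worth checking (editorial assumptions, not from the original post): discovery should be run while logged on as the account that was configured as the farm administrator (domain\ctxadmin), not the local Administrator; and in many setups a console connection bypasses the published-desktop restriction that produces the "currently unavailable" error:

        mstsc /v:<xenapp-server> /admin

    Once on the console as ctxadmin, the policy from CTX124745 can be changed and discovery re-run, which should break the loop.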

    Read the article

  • PowerShell Script To Find Where SharePoint 2010 Features Are Activated

    - by Brian Jackett
    The script in this post will find where features are activated within your SharePoint 2010 farm.

    Problem

    Over the past few months I’ve gotten literally dozens of emails, blog comments, or personal requests from people asking “how do I find where a SharePoint feature has been activated?” I wrote a script to find which features are installed on your farm almost 3 years ago. There is also the Get-SPFeature PowerShell cmdlet in SharePoint 2010. The problem is that these only tell you if a feature is installed, not where it has been activated. This is especially important to know if you have multiple web applications, site collections, and/or sites.

    Solution

    The default call (no parameters) for Get-SPFeature returns all features in the farm. Many of the parameter sets accept filters for specific scopes such as web application, site collection, and site. If those are supplied, then only the enabled/activated features are returned for that filtered scope. Taking the concept of recursively traversing a SharePoint farm and merging that with calls to Get-SPFeature at all levels of the farm, you can find out what features are activated at each level. Store the results in a variable and you end up with all features that are activated at every level.

    Below is the script I came up with (slight edits for posting on blog). With no parameters the function lists all features activated at all scopes. If you provide an Identity parameter you will find where a specific feature is activated. Note that the display name for a feature you see in the SharePoint UI rarely matches the “internal” display name; I would recommend using the feature id instead. You can download a full copy of the script by clicking on the link below.

    Note: This script is not optimized for medium to large farms. In my testing it took 1-3 minutes to recurse through my demo environment. This script is provided as-is with no warranty. Run this in a smaller dev / test environment first.

        function Get-SPFeatureActivated {
            # see full script for help info, removed for formatting
            [CmdletBinding()]
            param(
                [Parameter(position = 1, valueFromPipeline=$true)]
                [Microsoft.SharePoint.PowerShell.SPFeatureDefinitionPipeBind]
                $Identity
            )#end param

            Begin
            {
                # declare empty array to hold results. Will add custom member
                # for Url to show where activated at on objects returned from Get-SPFeature.
                $results = @()
                $params = @{}
            }
            Process
            {
                if([string]::IsNullOrEmpty($Identity) -eq $false)
                {
                    $params = @{Identity = $Identity
                                ErrorAction = "SilentlyContinue"
                    }
                }

                # check farm features
                $results += (Get-SPFeature -Farm -Limit All @params |
                             % {Add-Member -InputObject $_ -MemberType noteproperty `
                                -Name Url -Value ([string]::Empty) -PassThru} |
                             Select-Object -Property Scope, DisplayName, Id, Url)

                # check web application features
                foreach($webApp in (Get-SPWebApplication))
                {
                    $results += (Get-SPFeature -WebApplication $webApp -Limit All @params |
                                 % {Add-Member -InputObject $_ -MemberType noteproperty `
                                    -Name Url -Value $webApp.Url -PassThru} |
                                 Select-Object -Property Scope, DisplayName, Id, Url)

                    # check site collection features in current web app
                    foreach($site in ($webApp.Sites))
                    {
                        $results += (Get-SPFeature -Site $site -Limit All @params |
                                     % {Add-Member -InputObject $_ -MemberType noteproperty `
                                        -Name Url -Value $site.Url -PassThru} |
                                     Select-Object -Property Scope, DisplayName, Id, Url)

                        # check site features in current site collection
                        foreach($web in ($site.AllWebs))
                        {
                            $results += (Get-SPFeature -Web $web -Limit All @params |
                                         % {Add-Member -InputObject $_ -MemberType noteproperty `
                                            -Name Url -Value $web.Url -PassThru} |
                                         Select-Object -Property Scope, DisplayName, Id, Url)
                            $web.Dispose()
                        }
                        $site.Dispose()
                    }
                }
            }
            End
            {
                $results
            }
        } #end Get-SPFeatureActivated

    Snippet of output from Get-SPFeatureActivated

    Conclusion

    This script has been requested for a long time and I’m glad to finally have a working “clean” version. If you find any bugs or issues with the script please let me know. I’ll be posting this to the TechNet Script Center after some internal review. Enjoy the script and I hope it helps with your admin / developer needs.

        -Frog Out
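    For quick reference, calling the function looks like this (the GUID below is a placeholder for the feature you are hunting):

        # everything activated anywhere in the farm
        Get-SPFeatureActivated

        # where is this one feature activated?
        Get-SPFeatureActivated -Identity 00bfea71-1c5e-4a24-b310-ba51c3eb7a57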

    Read the article

  • AgroSense in Java Magazine November/December 2012

    - by Geertjan
    AgroSense, the Duke's Choice Award winning open source farm management system from the Netherlands, is featured in the hot-off-the-presses latest edition of the always awesome Java Magazine (November/December 2012). Read the whole article after subscribing for free to the magazine, by clicking the image above or by clicking this link. Note: if you're reading this and your software organization is doing anything at all that relates to farm management, consider porting your software to an AgroSense plugin. That would save you an immense amount of time, and your users would get a comprehensive farm management system out of the box. Don't reinvent the wheel: create your farm management software on top of the AgroSense Platform!

    Read the article

  • Handling very large lists of objects without paging?

    - by user246114
    Hi, I have a class which can contain many small elements in a list. It looks like:

        public class Farm {
            private ArrayList<Horse> mHorses;
        }

    Just wondering what will happen if the mHorses list grows to something crazy like 15,000 elements. I'm assuming that trying to write and read this from the datastore would be crazy, because I'd get killed on the serialization process. It's important that I can get the entire array in one shot without paging, and each Horse element may only have two string properties in it, so they are pretty lightweight:

        public class Horse {
            private String mId;
            private String mName;
        }

    I don't need these horses indexed at all. Does it sound reasonable to just store the mHorses list as a raw Text field, and force my clients to do the deserialization? Something like:

        public class Farm {
            private Text mHorsesSerialized;
        }

    Then whenever the client receives a Farm instance, it has to take the raw string of horses and split it in order to reinstantiate the list, something like:

        // GWT client perhaps
        Farm farm = rpcCall.getMyFarm();
        String horsesSerialized = farm.getHorses();
        String[] horseBlocks = horsesSerialized.split(",");
        for (int i = 0; i < horseBlocks.length; i++) {
            // .. continue deserializing the individual objects ...
        }

    Yeah... so hopefully it would be quick to read a Farm instance from the datastore, and the serialization penalty is paid by the client. Thanks
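    A minimal sketch of both halves of that encoding, to pair with the client split above (the HorseCodec class is hypothetical; it assumes Horse exposes getId()/getName() and an (id, name) constructor, and that ids and names never contain the delimiters):

        import java.util.ArrayList;
        import java.util.List;

        public class HorseCodec {

            // Server side: flatten the list into the raw Text payload.
            public static String serialize(List<Horse> horses) {
                StringBuilder sb = new StringBuilder();
                for (Horse h : horses) {
                    if (sb.length() > 0) sb.append(',');                   // separates horses
                    sb.append(h.getId()).append('|').append(h.getName());  // separates fields
                }
                return sb.toString();
            }

            // Client side: reverse the encoding.
            public static List<Horse> deserialize(String raw) {
                List<Horse> horses = new ArrayList<Horse>();
                if (raw == null || raw.isEmpty()) return horses;
                for (String block : raw.split(",")) {
                    String[] fields = block.split("\\|");
                    horses.add(new Horse(fields[0], fields[1]));
                }
                return horses;
            }
        }

    One caveat worth noting: a single datastore entity is capped at 1 MB, which puts a hard ceiling on how many horses fit in one Text field.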

    Read the article

  • ORACLE Fusion Middleware Summer Camps in Lisbon: Includes Advanced ADF Training by Oracle Product Management

    - by Frank Nimphius
    From July 9th to July 13th 2012, Frank Nimphius and Grant Ronald from the JDeveloper and ADF Product Management team present a 4 1/2 day advanced ADF training in Lisbon for the EMEA Oracle SOA Community. The training runs in parallel to an advanced WebCenter training, giving you a chance to network with your peers during breaks and lunch. See below the announcement by the ORACLE EMEA SOA Community for details: For Specialized partners who are working on the following projects and opportunities, we (Oracle) offer these advanced summer camps:
    - ADF 11g
    - WebCenter Portal
    - SOA Suite 11g
    - ADF for BPM Suite 11
    - WebCenter Sites 11g
    All training sessions will be delivered by HQ product management and our PTS team. The sessions will take place in July in Lisbon, Portugal and Munich, Germany. Participation is limited to two people per company and bootcamp. Registration is handled first come, first served; please pay attention to the skill requirements, the prerequisites and the follow-up! We will not accept people onto the training who do not match the criteria! Lisbon: Monday, July 9th 11:00 AM - Friday, July 13th 16:00 (Lisbon time). ADF 11g advanced training by Grant Ronald and Frank Nimphius; WebCenter Portal advanced training by Stefan Krantz and Angelo Santagata. Cost: free of charge; cancellation or no-show fee 2.000 €. Bootcamps are limited to 20 persons, first come first served. For details and registration please visit the Lisbon registration page.

    Read the article

  • Cannot connect Ubuntu to Windows PC in either direction

    - by Frank A
    New to Ubuntu and really struggling with this. I want to connect to my old Windows XP PC on our home network. Searching for solutions gets complicated: at one level you are told to use Connect to Server, so I set it to Windows Share and type in the server IP address, but I get "Failed to retrieve share list from server" (a demo on YouTube worked with no problem). Other advice on Ask Ubuntu is that you need to install Samba. I did that, but nothing seems to happen when I try to run it, other than it asking for the admin password. (How do you tell what is running on Ubuntu?) So I try the other direction, Windows XP to Ubuntu. I made the Ubuntu directory /home/frank shared and tried various combinations such as \\ipaddress\home\frank, but I just get "The folder you entered does not appear to be valid. Please choose another." My entire data-only drive is shared in Windows and there are no problems accessing it from other Win XP boxes on our network. There are no alerts in the Windows firewall; Ubuntu's Firestarter did block, but I changed that to allowed... or so I thought. In Firestarter I had set up an inbound traffic policy for 192.168.1.1/24, and since then it has added the IP address of the Win PC twice. So I am in a state of confusion, not knowing where to turn next, and thought I'd ask Ask Ubuntu :)
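    For the XP-to-Ubuntu direction, installing Samba alone isn't enough; it needs a share definition and a Samba password. A minimal sketch (the share and user name "frank" mirror the question; service names vary by Ubuntu release):

        sudo smbpasswd -a frank          # give the Unix user a Samba password

        # /etc/samba/smb.conf - add at the end:
        [frank]
            path = /home/frank
            read only = no
            browseable = yes

        sudo service smbd restart        # older releases: sudo /etc/init.d/samba restart

    From the XP box the share is then reached as \\<ubuntu-ip>\frank - the share name, not the filesystem path, so \\<ubuntu-ip>\home\frank will always fail with the "not valid" error above.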

    Read the article

  • Coordinating the logging output from a web app installed in a SharePoint farm.

    - by Kelly French
    We are deploying web parts to SharePoint 2007 and would like to include logging (log4net). The ideal solution would be to use a database appender to avoid the problem of knowing which actual server is executing the web part. This question has been helpful: http://stackoverflow.com/questions/219668/sharepoint-and-log4net. I've got log4net working in a stand-alone web app using the Visual Studio dev server, with the web.config holding the log4net settings and a file appender for the output. I'd like to transition to SharePoint and still use the log file output so I can make sure it's all working first, then change the config around to log to a database. Is this going to be too much trouble? How have other developers added log4net into their solutions for SharePoint?
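    For the later database step, log4net's standard AdoNetAppender is the usual route. A trimmed sketch of the web.config section (connection string, table and column names are placeholders; the remaining parameter blocks follow the same pattern as the one shown):

        <log4net>
          <appender name="AdoNetAppender" type="log4net.Appender.AdoNetAppender">
            <bufferSize value="1" />
            <connectionType value="System.Data.SqlClient.SqlConnection, System.Data, Version=2.0.0.0, Culture=neutral, PublicKeyToken=b77a5c561934e089" />
            <connectionString value="Data Source=...;Initial Catalog=...;Integrated Security=True" />
            <commandText value="INSERT INTO Log ([Date],[Level],[Logger],[Message]) VALUES (@log_date,@log_level,@logger,@message)" />
            <parameter>
              <parameterName value="@message" />
              <dbType value="String" />
              <size value="4000" />
              <layout type="log4net.Layout.PatternLayout" value="%message" />
            </parameter>
            <!-- @log_date, @log_level and @logger parameter blocks omitted for brevity -->
          </appender>
          <root>
            <level value="INFO" />
            <appender-ref ref="AdoNetAppender" />
          </root>
        </log4net>

    A bufferSize of 1 flushes every event, which is convenient while verifying the setup. In a SharePoint farm the config lives in the web.config on every web front end, which is exactly why a central database appender pays off: one sink, regardless of which server served the request.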

    Read the article

  • Book Review

    - by frank.buytendijk
    ... and in the series of videoblogs, here is number 3: reviewing a few really good books I recently read. Access the videoblog here. More on www.youtube.com/frankbuytendijk. frank

    Read the article

  • Displaying topics/categories related ads

    - by Frank
    Is there any advertising company that allows webmasters to decide on general ad topics (such as "Entertainment", "Autos and vehicles", "Arts", etc.) to be displayed to a user visiting a website? I know you can choose specific ad campaigns to show to users, or let the advertising company decide for you which ads to show, but I am looking for an option to ask the advertising company to show ads based on categories/topics. Thanks. Frank

    Read the article

  • 301 Redirect adding incorrect extra segments to a url

    - by Pentland_web
    I need to 301 redirect one segment of a URL to a new version of it. My aim is to redirect www.domain.co.uk/farm/whatever/ to www.domain.co.uk/farm_cottages/whatever/. The rule I am using to do this is:

        RedirectMatch 301 ^/farm/ /farm_cottages/

    However, for some reason it only partially works: it appends ?/farm/ after farm_cottages/ in the final URL, i.e. www.domain.co.uk/farm_cottages/?/farm/whatever/. Here is my entire rewrite rule set, as I believe one of the rewrite rules could be interfering with the redirect rule:

        #redirect /farm/ to /farm_cottages/
        RedirectMatch 301 ^/farm/ /farm_cottages/

        <IfModule mod_rewrite.c>
        RewriteEngine on

        # no WWW to WWW
        RewriteCond %{HTTP_HOST} !^www.domain.co.uk$ [NC]
        RewriteRule ^(.*)$ http://www.domain.co.uk/$1 [R=301,L]

        # Force trailing slash on URLs
        RewriteBase /
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_URI} !(.*)/$
        RewriteRule ^(.*)$ http://www.domain.co.uk/$1/ [L,R=301]

        # Remove index.php
        RewriteCond $1 !\.(gif|jpe?g|png)$ [NC]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule ^(.*)$ /index.php?/$1 [L]

        RewriteCond %{THE_REQUEST} ^[^/]*/index\.php [NC]
        RewriteRule ^index\.php(.+) $1 [R=301,L]
        </IfModule>

    The site is built in ExpressionEngine, so the URL segments do not represent actual folders. I don't think this should matter, though. Any ideas would be much appreciated! Thanks
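    A hedged fix (untested against this exact site): mixing mod_alias (RedirectMatch) with mod_rewrite is the classic source of stray ?/... fragments, because the two modules process the URL independently and the redirect ends up seeing the index.php?/... form. Doing the redirect with mod_rewrite itself, as the first rule inside the IfModule block, avoids the clash:

        RewriteEngine on
        RewriteRule ^farm/(.*)$ /farm_cottages/$1 [R=301,L]

    with the RedirectMatch line removed entirely, so only one module handles the URL.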

    Read the article

  • Alternatives to multiple inheritance for my architecture (NPCs in a Realtime Strategy game)?

    - by Brettetete
    Coding isn't that hard, actually. The hard part is writing code that makes sense and is readable and understandable. So I want to become a better developer and create some solid architecture. I want to create an architecture for NPCs in a video game, a real-time strategy game like Starcraft, Age of Empires, Command & Conquer, etc. So I'll have different kinds of NPCs. An NPC can have one or more of these abilities (methods): Build(), Farm() and Attack(). Examples:
    - Worker can Build() and Farm()
    - Warrior can Attack()
    - Citizen can Build(), Farm() and Attack()
    - Fisherman can Farm() and Attack()
    I hope everything is clear so far. So now I have my NPC types and their abilities. But let's come to the technical/programmatic aspect. What would be a good programmatic architecture for my different kinds of NPCs? Okay, I could have a base class. Actually I think this is a good way to stick to the DRY principle, so I can have methods like WalkTo(x,y) in my base class, since every NPC will be able to move. But now let's come to the real problem: where do I implement my abilities? (Remember: Build(), Farm() and Attack().) Since the abilities will consist of the same logic, it would be annoying, and would break the DRY principle, to implement them for each NPC (Worker, Warrior, ...). Okay, I could implement the abilities within the base class, but that would require some kind of logic that verifies whether an NPC can use ability X: IsBuilder, CanBuild, ... I think it is clear what I want to express. I don't feel very good about this idea, though; it sounds like a bloated base class with too much functionality. I use C# as the programming language, so multiple inheritance isn't an option here. That means extra base classes like Fisherman : Farmer, Attacker won't work.
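    One route this question is circling (a sketch, not the only answer): model the abilities as interfaces, with the shared logic in small reusable ability components rather than in the base class. Each concrete NPC composes only the abilities it has, and callers depend on the capability, not the concrete type:

        // Abilities as interfaces: code that needs builders asks for IBuilder, not Worker.
        interface IBuilder { void Build(); }
        interface IFarmer { void Farm(); }
        interface IAttacker { void Attack(); }

        // Shared ability logic lives once, in component classes.
        class BuildAbility
        {
            public void Build(Npc owner) { /* shared build logic */ }
        }

        class FarmAbility
        {
            public void Farm(Npc owner) { /* shared farm logic */ }
        }

        // Base class keeps only what every NPC truly shares.
        abstract class Npc
        {
            public void WalkTo(int x, int y) { /* shared movement */ }
        }

        class Worker : Npc, IBuilder, IFarmer
        {
            private readonly BuildAbility build = new BuildAbility();
            private readonly FarmAbility farm = new FarmAbility();
            public void Build() { build.Build(this); }
            public void Farm() { farm.Farm(this); }
        }

        class Warrior : Npc, IAttacker
        {
            public void Attack() { /* delegate to an AttackAbility the same way */ }
        }

    This keeps DRY (the ability logic exists once), avoids the bloated base class, and sidesteps the multiple-inheritance limitation: a Citizen simply implements all three interfaces and composes all three components.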

    Read the article

  • Moving screenshots from 1 folder to another instantly

    - by Frank
    I am hosting a gaming site, and once in a while the server automatically creates a screenshot in the server's folder. Unfortunately, where it puts these screenshots is NOT configurable in the server settings; it just always dumps the PNG file in the server config folder. My folder structure is, for example, /home/Game/Server1/. What I would like to achieve is that once the server creates a screenshot in one of these server folders (I have multiple), the operating system moves the screenshot IMMEDIATELY (it is always a *.png file) to the webserver folder, for example /var/www/Server1/filename.png, so that players can see the screenshot on the website. Does anyone have an idea how I can tackle this problem in the smartest way? Please note that my ideal situation would be if the PNG file is moved immediately after creation. Thanks for your help. Frank
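    On Linux the instant-move requirement is usually met with inotify rather than a cron poll. A sketch using the inotify-tools package (paths from the question; the script itself is hypothetical and would need one copy, or an outer loop, per server folder):

        #!/bin/bash
        # Move every new PNG from the game server folder to the web folder
        # as soon as the server finishes writing it.
        inotifywait -m -e close_write --format '%f' /home/Game/Server1 | \
        while read file; do
            case "$file" in
                *.png) mv "/home/Game/Server1/$file" "/var/www/Server1/$file" ;;
            esac
        done

    Watching close_write instead of create avoids grabbing a PNG the game server is still in the middle of writing.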

    Read the article

  • 256 Windows Azure Worker Roles, Windows Kinect and a 90's Text-Based Ray-Tracer

    - by Alan Smith
    For a couple of years I have been demoing a simple render farm hosted in Windows Azure using worker roles and the Azure Storage service. At the start of the presentation I deploy an Azure application that uses 16 worker roles to render a 1,500 frame 3D ray-traced animation. At the end of the presentation, when the animation was complete, I would play the animation and delete the Azure deployment. The standing joke with the audience was that it was a “$2 demo”, as the compute charges for running the 16 instances for an hour were $1.92; factor in the bandwidth charges and it’s a couple of dollars. The point of the demo is that it highlights one of the great benefits of cloud computing: you pay for what you use, and if you need massive compute power for a short period of time, using Windows Azure can work out very cost effective. The “$2 demo” was great for presenting at user groups and conferences in that it could be deployed to Azure, used to render an animation, and then removed in a one hour session. I have always had the idea of doing something a bit more impressive with the demo, and scaling it from a “$2 demo” to a “$30 demo”. The challenge was to create a visually appealing animation in high definition format and keep the demo time down to one hour. This article will take a run through how I achieved this.

    Ray Tracing

    Ray tracing, a technique for generating high quality photorealistic images, gained popularity in the 90’s, with companies like Pixar creating feature length computer animations, and also the emergence of shareware text-based ray tracers that could run on a home PC. In order to render a ray traced image, the ray of light that would pass from the view point must be tracked until it intersects with an object. At the intersection, the color, reflectiveness, transparency, and refractive index of the object are used to calculate if the ray will be reflected or refracted. Each pixel may require thousands of calculations to determine what color it will be in the rendered image.

    Pin-Board Toys

    Having very little artistic talent and a basic understanding of maths, I decided to focus on an animation that could be modeled fairly easily and would look visually impressive. I’ve always liked the pin-board desktop toys that became popular in the 80’s, and when I was working as a 3D animator back in the 90’s I always had the idea of creating a 3D ray-traced animation of a pin-board, but never found the energy to do it. Even if I had had a go at it, the render time to produce an animation that would look respectable on a 486 would have been measured in months.

    PolyRay

    Back in 1995 I landed my first real job, after spending three years being a beach-ski-climbing-paragliding-bum, and was employed to create 3D ray-traced animations for a CD-ROM that school kids would use to learn physics. I had got into the strange and wonderful world of text-based ray tracing, and was using a shareware ray-tracer called PolyRay. PolyRay takes a text file describing a scene as input and, after a few hours of processing on a 486, produces a high quality ray-traced image. The following is an example of a basic PolyRay scene file.
        background Midnight_Blue

        static define matte surface { ambient 0.1 diffuse 0.7 }
        define matte_white texture { matte { color white } }
        define matte_black texture { matte { color dark_slate_gray } }
        define position_cylindrical 3
        define lookup_sawtooth 1
        define light_wood <0.6, 0.24, 0.1>
        define median_wood <0.3, 0.12, 0.03>
        define dark_wood <0.05, 0.01, 0.005>

        define wooden texture {
            noise surface {
                ambient 0.2  diffuse 0.7  specular white, 0.5
                microfacet Reitz 10
                position_fn position_cylindrical position_scale 1
                lookup_fn lookup_sawtooth
                octaves 1 turbulence 1
                color_map(
                    [0.0, 0.2, light_wood, light_wood]
                    [0.2, 0.3, light_wood, median_wood]
                    [0.3, 0.4, median_wood, light_wood]
                    [0.4, 0.7, light_wood, light_wood]
                    [0.7, 0.8, light_wood, median_wood]
                    [0.8, 0.9, median_wood, light_wood]
                    [0.9, 1.0, light_wood, dark_wood])
            }
        }
        define glass texture { surface { ambient 0 diffuse 0 specular 0.2 reflection white, 0.1 transmission white, 1, 1.5 } }
        define shiny surface { ambient 0.1 diffuse 0.6 specular white, 0.6 microfacet Phong 7 }
        define steely_blue texture { shiny { color black } }
        define chrome texture { surface { color white ambient 0.0 diffuse 0.2 specular 0.4 microfacet Phong 10 reflection 0.8 } }

        viewpoint {
            from <4.000, -1.000, 1.000> at <0.000, 0.000, 0.000> up <0, 1, 0> angle 60
            resolution 640, 480 aspect 1.6 image_format 0
        }

        light <-10, 30, 20>
        light <-10, 30, -20>

        object { disc <0, -2, 0>, <0, 1, 0>, 30 wooden }

        object { sphere <0.000, 0.000, 0.000>, 1.00 chrome }
        object { cylinder <0.000, 0.000, 0.000>, <0.000, 0.000, -4.000>, 0.50 chrome }

    After setting up the background and defining colors and textures, the viewpoint is specified. The “camera” is located at a point in 3D space, and it looks towards another point. The angle, image resolution, and aspect ratio are specified. Two lights are present in the image at defined coordinates. The three objects in the image are a wooden disc to represent a table top, and a sphere and cylinder that intersect to form a pin that will be used for the pin board toy in the final animation. When the image is rendered, the following image is produced. The pins are modeled with a chrome surface, so they reflect the environment around them. Note that the scale of the pin shaft is not correct; this will be fixed later.

    Modeling the Pin Board

    The frame of the pin-board is made up of three boxes and six cylinders. The front box is modeled using a clear, slightly reflective solid with the same refractive index as glass. The other shapes are modeled as metal.

        object { box <-5.5, -1.5, 1>, <5.5, 5.5, 1.2> glass }
        object { box <-5.5, -1.5, -0.04>, <5.5, 5.5, -0.09> steely_blue }
        object { box <-5.5, -1.5, -0.52>, <5.5, 5.5, -0.59> steely_blue }
        object { cylinder <-5.2, -1.2, 1.4>, <-5.2, -1.2, -0.74>, 0.2 steely_blue }
        object { cylinder <5.2, -1.2, 1.4>, <5.2, -1.2, -0.74>, 0.2 steely_blue }
        object { cylinder <-5.2, 5.2, 1.4>, <-5.2, 5.2, -0.74>, 0.2 steely_blue }
        object { cylinder <5.2, 5.2, 1.4>, <5.2, 5.2, -0.74>, 0.2 steely_blue }
        object { cylinder <0, -1.2, 1.4>, <0, -1.2, -0.74>, 0.2 steely_blue }
        object { cylinder <0, 5.2, 1.4>, <0, 5.2, -0.74>, 0.2 steely_blue }

    In order to create the matrix of pins that make up the pin board, I used a basic console application with a few nested loops to create two intersecting matrixes of pins, which models the layout used in the pin boards. The resulting image is shown below. The pin board contains 11,481 pins, with the scene file containing 23,709 lines of code.
    For the complete animation, 2,000 scene files will be created, which is over 47 million lines of code. Each pin in the pin-board will slide out a specific distance when an object is pressed into the back of the board. This is easily modeled by setting the Z coordinate of the pin to a specific value. In order to set all of the pins in the pin-board to the correct position, a bitmap image can be used: the position of each pin is set based on the color of the pixel at the corresponding position in the image. When the Windows Azure logo is used to set the Z coordinate of the pins, the following image is generated. The challenge now was to make a cool animation. The Azure logo is fine, but it is static. Using a normal video to animate the pins would not work; the colors in the video would not correspond to the depth of the objects from the camera. In order to simulate the pin board accurately, a series of frames from a depth camera could be used.

    Windows Kinect

    The Kinect controllers for the X-Box 360 and Windows feature a depth camera. The Kinect SDK for Windows provides a programming interface for Kinect, providing easy access for .NET developers to the Kinect sensors. The Kinect Explorer provided with the Kinect SDK is a great starting point for exploring Kinect from a developer's perspective. Both the X-Box 360 Kinect and the Windows Kinect will work with the Kinect SDK; the Windows Kinect is required for commercial applications, but the X-Box Kinect can be used for hobby projects. The Windows Kinect has the advantage of providing a mode that allows depth capture with objects closer to the camera, which makes for a more accurate depth image for setting the pin positions.

    Creating a Depth Field Animation

    The depth field animation used to set the positions of the pins in the pin board was created using a modified version of the Kinect Explorer sample application. In order to simulate the pin board accurately, a small section of the depth range from the depth sensor will be used. Any part of the object in front of the depth range will result in a white pixel; anything behind the depth range will be black. Within the depth range, the pixels in the image will be set to RGB values from 0,0,0 to 255,255,255. A screen shot of the modified Kinect Explorer application is shown below. The Kinect Explorer sample application was modified to include slider controls that are used to set the depth range that forms the image from the depth stream. This allows the fine tuning of the depth image that is required for simulating the position of the pins in the pin board. The Kinect Explorer was also modified to record a series of images from the depth camera and save them as a sequence of JPEG files that will be used to animate the pins in the animation; the Start and Stop buttons are used to start and stop the image recording. An example of one of the depth images is shown below. Once a series of 2,000 depth images has been captured, the task of creating the animation can begin.

    Rendering a Test Frame

    In order to test the creation of frames and get an approximation of the time required to render each frame, a test frame was rendered on-premise using PolyRay. The output of the rendering process is shown below. The test frame contained 23,629 primitive shapes, most of which are the spheres and cylinders that are used for the 11,800 or so pins in the pin board.
    The 1280x720 image contains 921,600 pixels, but as anti-aliasing was used, the number of rays that were calculated was 4,235,777, with 3,478,754,073 object boundaries checked. The test frame of the pin board with the depth field image applied is shown below. The tracing time for the test frame was 4 minutes 27 seconds, which means rendering the 2,000 frames in the animation would take over 148 hours, or a little over 6 days. Although this is much faster than an old 486, waiting almost a week to see the results of an animation would make it challenging for animators to create, view, and refine their animations. It would be much better if the animation could be rendered in less than one hour.

    Windows Azure Worker Roles

    The cost of creating an on-premise render farm to render animations increases in proportion to the number of servers. The table below shows the cost of servers for creating a render farm, assuming a cost of $500 per server.

        Number of Servers | Cost
        1                 | $500
        16                | $8,000
        256               | $128,000

    As well as the cost of the servers, there would be additional costs for networking, racks, etc. Hosting an environment of 256 servers on-premise would require a server room with cooling, and some pretty hefty power cabling. The Windows Azure compute services provide worker roles, which are ideal for performing processor intensive compute tasks. With the scalability available in Windows Azure, a job that takes 256 hours to complete could be performed using different numbers of worker roles. The time and cost of using 1, 16 or 256 worker roles is shown below.

        Number of Worker Roles | Render Time | Cost
        1                      | 256 hours   | $30.72
        16                     | 16 hours    | $30.72
        256                    | 1 hour      | $30.72

    Using worker roles in Windows Azure results in the same cost for the 256 hour job, irrespective of the number of worker roles used. Provided the compute task can be broken down into many small units, and the worker role compute power can be used effectively, it makes sense to scale the application so that the task is completed quickly, making the results available in a timely fashion. The task of rendering 2,000 frames in an animation is one that can easily be broken down into 2,000 individual pieces, which can be performed by a number of worker roles.

    Creating a Render Farm in Windows Azure

    The architecture of the render farm is shown in the following diagram. The render farm is a hybrid application with the following components:

    On-Premise:
    - Windows Kinect – used, combined with the Kinect Explorer, to create a stream of depth images.
    - Animation Creator – this application uses the depth images from the Kinect sensor to create scene description files for PolyRay. These files are then uploaded to the jobs blob container, and job messages added to the jobs queue.
    - Process Monitor – this application queries the role instance lifecycle table and displays statistics about the render farm environment and render process.
    - Image Downloader – this application polls the image queue and downloads the rendered animation files once they are complete.

    Windows Azure:
    - Azure Storage – queues and blobs are used for the scene description files and completed frames. A table is used to store the statistics about the rendering environment.

    The architecture of each worker role is shown below. The worker role is configured to use local storage, which provides file storage on the worker role instance that can be used by the applications to render the image and transform the format of the image.
    The service definition for the worker role with the local storage configuration highlighted is shown below.

        <?xml version="1.0" encoding="utf-8"?>
        <ServiceDefinition name="CloudRay" >
          <WorkerRole name="CloudRayWorkerRole" vmsize="Small">
            <Imports>
            </Imports>
            <ConfigurationSettings>
              <Setting name="DataConnectionString" />
            </ConfigurationSettings>
            <LocalResources>
              <LocalStorage name="RayFolder" cleanOnRoleRecycle="true" />
            </LocalResources>
          </WorkerRole>
        </ServiceDefinition>

    The two executable programs, PolyRay.exe and DTA.exe, are included in the Azure project, with Copy Always set as the property. PolyRay will take the scene description file and render it to a Truevision TGA file. As the TGA format has not seen much use since the mid 90’s, it is converted to a JPG image using Dave's Targa Animator, another shareware application from the 90’s. Each worker role will use the following process to render the animation frames.

    1. The worker process polls the job queue; if a job is available, the scene description file is downloaded from blob storage to local storage.
    2. PolyRay.exe is started in a process with the appropriate command line arguments to render the image as a TGA file.
    3. DTA.exe is started in a process with the appropriate command line arguments to convert the TGA file to a JPG file.
    4. The JPG file is uploaded from local storage to the images blob container.
    5. A message is placed on the images queue to indicate a new image is available for download.
    6. The job message is deleted from the job queue.
    7. The role instance lifecycle table is updated with statistics on the number of frames rendered by the worker role instance, and the CPU time used.

    The code for this is shown below.

        public override void Run()
        {
            // Set environment variables
            string polyRayPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), PolyRayLocation);
            string dtaPath = Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), DTALocation);

            LocalResource rayStorage = RoleEnvironment.GetLocalResource("RayFolder");
            string localStorageRootPath = rayStorage.RootPath;

            JobQueue jobQueue = new JobQueue("renderjobs");
            JobQueue downloadQueue = new JobQueue("renderimagedownloadjobs");
            CloudRayBlob sceneBlob = new CloudRayBlob("scenes");
            CloudRayBlob imageBlob = new CloudRayBlob("images");
            RoleLifecycleDataSource roleLifecycleDataSource = new RoleLifecycleDataSource();

            Frames = 0;

            while (true)
            {
                // Get the render job from the queue
                CloudQueueMessage jobMsg = jobQueue.Get();

                if (jobMsg != null)
                {
                    // Get the file details
                    string sceneFile = jobMsg.AsString;
                    string tgaFile = sceneFile.Replace(".pi", ".tga");
                    string jpgFile = sceneFile.Replace(".pi", ".jpg");

                    string sceneFilePath = Path.Combine(localStorageRootPath, sceneFile);
                    string tgaFilePath = Path.Combine(localStorageRootPath, tgaFile);
                    string jpgFilePath = Path.Combine(localStorageRootPath, jpgFile);

                    // Copy the scene file to local storage
                    sceneBlob.DownloadFile(sceneFilePath);

                    // Run the ray tracer.
                    string polyrayArguments =
                        string.Format("\"{0}\" -o \"{1}\" -a 2", sceneFilePath, tgaFilePath);
                    Process polyRayProcess = new Process();
                    polyRayProcess.StartInfo.FileName =
                        Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), polyRayPath);
                    polyRayProcess.StartInfo.Arguments = polyrayArguments;
                    polyRayProcess.Start();
                    polyRayProcess.WaitForExit();

                    // Convert the image
                    string dtaArguments =
                        string.Format(" {0} /FJ /P{1}", tgaFilePath, Path.GetDirectoryName(jpgFilePath));
                    Process dtaProcess = new Process();
                    dtaProcess.StartInfo.FileName =
                        Path.Combine(Environment.GetEnvironmentVariable("RoleRoot"), dtaPath);
                    dtaProcess.StartInfo.Arguments = dtaArguments;
                    dtaProcess.Start();
                    dtaProcess.WaitForExit();

                    // Upload the image to blob storage
                    imageBlob.UploadFile(jpgFilePath);

                    // Add a download job.
                    downloadQueue.Add(jpgFile);

                    // Delete the render job message
                    jobQueue.Delete(jobMsg);

                    Frames++;
                }
                else
                {
                    Thread.Sleep(1000);
                }

                // Log the worker role activity.
                roleLifecycleDataSource.Alive
                    ("CloudRayWorker", RoleLifecycleDataSource.RoleLifecycleId, Frames);
            }
        }

    Monitoring Worker Role Instance Lifecycle

    In order to get more accurate statistics about the lifecycle of the worker role instances used to render the animation, data was tracked in an Azure storage table. The following class was used to track the worker role lifecycles in Azure storage.

        public class RoleLifecycle : TableServiceEntity
        {
            public string ServerName { get; set; }
            public string Status { get; set; }
            public DateTime StartTime { get; set; }
            public DateTime EndTime { get; set; }
            public long SecondsRunning { get; set; }
            public DateTime LastActiveTime { get; set; }
            public int Frames { get; set; }
            public string Comment { get; set; }

            public RoleLifecycle()
            {
            }

            public RoleLifecycle(string roleName)
            {
                PartitionKey = roleName;
                RowKey = Utils.GetAscendingRowKey();
                Status = "Started";
                StartTime = DateTime.UtcNow;
                LastActiveTime = StartTime;
                EndTime = StartTime;
                SecondsRunning = 0;
                Frames = 0;
            }
        }

    A new instance of this class is created and added to the storage table when the role starts. It is then updated each time the worker renders a frame, to record the total number of frames rendered and the total processing time. These statistics are used by the monitoring application to determine the effectiveness of the use of resources in the render farm.

    Rendering the Animation

    The Azure solution was deployed to Windows Azure with the service configuration set to 16 worker role instances. This allows the application to be tested in the cloud environment, and the performance of the application determined. When I demo the application at conferences and user groups I often start with 16 instances, and then scale up the application to the full 256 instances. The configuration to run 16 instances is shown below.
<?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="16" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     About six minutes after deploying the application the first worker roles become active and start to render the first frames of the animation. The CloudRay Monitor application displays an icon for each worker role instance, with a number indicating the number of frames that the worker role has rendered. The statistics on the left show the number of active worker roles and statistics about the render process. The render time is the time since the first worker role became active; the CPU time is the total amount of processing time used by all worker role instances to render the frames.   Five minutes after the first worker role became active the last of the 16 worker roles activated. By this time the first seven worker roles had each rendered one frame of the animation.   With 16 worker roles u and running it can be seen that one hour and 45 minutes CPU time has been used to render 32 frames with a render time of just under 10 minutes.     At this rate it would take over 10 hours to render the 2,000 frames of the full animation. In order to complete the animation in under an hour more processing power will be required. Scaling the render farm from 16 instances to 256 instances is easy using the new management portal. The slider is set to 256 instances, and the configuration saved. We do not need to re-deploy the application, and the 16 instances that are up and running will not be affected. Alternatively, the configuration file for the Azure service could be modified to specify 256 instances.   <?xml version="1.0" encoding="utf-8"?> <ServiceConfiguration serviceName="CloudRay" xmlns="http://schemas.microsoft.com/ServiceHosting/2008/10/ServiceConfiguration" osFamily="1" osVersion="*">   <Role name="CloudRayWorkerRole">     <Instances count="256" />     <ConfigurationSettings>       <Setting name="DataConnectionString"         value="DefaultEndpointsProtocol=https;AccountName=cloudraydata;AccountKey=..." />     </ConfigurationSettings>   </Role> </ServiceConfiguration>     Six minutes after the new configuration has been applied 75 new worker roles have activated and are processing their first frames.   Five minutes later the full configuration of 256 worker roles is up and running. We can see that the average rate of frame rendering has increased from 3 to 12 frames per minute, and that over 17 hours of CPU time has been utilized in 23 minutes. In this test the time to provision 140 worker roles was about 11 minutes, which works out at about one every five seconds.   We are now half way through the rendering, with 1,000 frames complete. This has utilized just under three days of CPU time in a little over 35 minutes.   The animation is now complete, with 2,000 frames rendered in a little over 52 minutes. The CPU time used by the 256 worker roles is 6 days, 7 hours and 22 minutes with an average frame rate of 38 frames per minute. The rendering of the last 1,000 frames took 16 minutes 27 seconds, which works out at a rendering rate of 60 frames per minute. 
    The frame counts in the server instances indicate that the use of a queue to distribute the workload has been very effective in spreading the load across the 256 worker role instances. The 16 instances that were deployed first have rendered between 11 and 13 frames each, whilst the 240 instances that were added when the application was scaled have rendered between 6 and 9 frames each.

    Completed Animation

    I’ve uploaded the completed animation to YouTube; a low resolution preview is shown below. Pin Board Animation Created using Windows Kinect and 256 Windows Azure Worker Roles. The animation can be viewed in 1280x720 resolution at the following link: http://www.youtube.com/watch?v=n5jy6bvSxWc

    Effective Use of Resources

    According to the CloudRay monitor statistics, the animation took 6 days, 7 hours and 22 minutes of CPU time to render; this works out at 152 hours of compute time, rounded up to the nearest hour. As worker role instances are billed for the full hour, it may have been possible to render the animation using fewer than 256 worker roles. When deciding the optimal usage of resources, the time required to provision and start the worker roles must also be considered. In the demo I started with 16 worker roles, and then scaled the application to 256 worker roles. It would have been more optimal to start the application with maybe 200 worker roles, and utilize the full hour that I was being billed for. This would, however, have prevented showing the ease of scalability of the application. The new management portal displays the CPU usage across the worker roles in the deployment. The average CPU usage across all instances is 93.27%, with over 99% used when all the instances are up and running. This shows that the worker role resources are being used very effectively.

    Grid Computing Scenarios

    Although I am using this scenario for a hobby project, there are many scenarios where a large amount of compute power is required for a short period of time. Windows Azure provides a great platform for developing these types of grid computing applications, and can work out very cost effective.
    - Windows Azure can provide massive compute power, on demand, in a matter of minutes.
    - The use of queues to manage the load balancing of jobs between role instances is a simple and effective solution.
    - Using a cloud-computing platform like Windows Azure allows proof-of-concept scenarios to be tested and evaluated on a very low budget.
    - No charges for inbound data transfer make the uploading of large data sets to Windows Azure Storage services cost effective. (Transaction charges still apply.)

    Tips for using Windows Azure for Grid Computing Scenarios

    I found a render farm a fairly simple scenario to implement using Windows Azure. I was impressed by the ease of scalability that Azure provides, and by the short time that the application took to scale from 16 to 256 worker role instances: in this case it was around 13 minutes; in other tests it took between 10 and 20 minutes. The following tips may be useful when implementing a grid computing project in Windows Azure.
    - Using an Azure Storage queue to load-balance the units of work across multiple worker roles is simple and very effective. The design I have used in this scenario could easily scale to many thousands of worker role instances.
    - Windows Azure accounts are typically limited to 20 cores. If you need to use more than this, a call to support and a credit card check will be required.
    - Be aware of how the billing model works. You will be charged for worker role instances for the full clock hour in which the instance is deployed, so schedule the workload to start just after the clock hour has started.
    - Monitor the utilization of the resources you are provisioning; ensure that you are not paying for worker roles that are idle.
    - If you are deploying third party applications to worker roles, you may well run into licensing issues. Purchasing software licenses on a per-processor basis when using hundreds of processors for a short time period would not be cost effective.
    - Third party software may also require installation onto the worker roles, which can be accomplished using start-up tasks. Bear in mind that adding a startup task and a possible re-boot will add to the time required for the worker role instance to start and activate. An alternative may be to use a prepared VM and use VM roles.
    - Consider using the Windows Azure Autoscaling Application Block (WASABi) to autoscale the worker roles in your application. When using a large number of worker roles, the utilization must be carefully monitored; if the scaling algorithms are not optimal, it could get very expensive!

    Read the article

  • Webinar: MySQL Enterprise Backup - Online "Hot" Backup for MySQL

    - by mike.frank(at)oracle.com
    Online backup has been one of the most requested features for MySQL. With MySQL Enterprise Backup, developers and DBAs have the tools they need to safely and rapidly back up and restore their databases. In this webinar we will go into the advantages of hot "online" backups. We will show how MySQL Enterprise Backup supports full, incremental, partial, and compressed backups that allow you to perform consistent point-in-time recovery, as well as saving both time and money.

    In this free webinar you will learn:
    - Backup strategies and methods
    - Comparison of backup types for MySQL
    - MySQL Enterprise Backup: Features
    - MySQL Enterprise Backup: Performance
    - MySQL Enterprise Backup: Architecture
    - MySQL Enterprise Backup: How it Works
    - MySQL Enterprise Backup: Script Examples

    English webinar: Mike Frank and Alex Roedling, Thursday, January 20, 2011, 09:00 Pacific time.
    Italian webinar: Luca Olivari, Thursday, January 20, 2011, 10:00 Central European time.
    Register now: English, Italian. On-demand French and German versions are available as well.

    Related articles:
    - Introducing our "Hot" MySQL Enterprise Backup (blogs.oracle.com)

    Read the article

  • OEPE with ADF binding support available: Total Eclipse

    - by Frank Nimphius
    The current release of Oracle Enterprise Pack for Eclipse, though a technology preview, brings Oracle ADF binding to the Eclipse IDE. You can download the software from the link below: Oracle Enterprise Pack for Eclipse (12.1.1.1.0) Technical Preview, new June 2012, certified on Windows 7/XP/Vista, MacOS, and Linux, supported on JDK 6. For many Eclipse users, ADF is new, and therefore I expect them to need guidance and help in case they run into issues they don't know how to recover from. Similarly, ADF users familiar with Oracle JDeveloper who want to give OEPE a try will find things different in Eclipse and thus may have questions. For both audiences I suggest posting issues to the OEPE forum on the Oracle Technology Network: I'll extend my OTN monitoring to include the OEPE forum on a daily basis to learn about developer needs and requirements and - of course - to catch bugs that need to be filed. From my side this is a part-time involvement, which means that the more ADF questions show up on the forum, the more help I may need in answering them. The OTN forum for JDeveloper, in my opinion, wouldn't be the right place to go unless the question is a generic ADF question that does not depend on the integration with Eclipse. Here's the OEPE forum link for a start: https://forums.oracle.com/forums/forum.jspa?forumID=578 Frank

    Read the article

  • Load Balance WCF and Share a Remote MSMQ for High Throughput

    - by BarDev
    After a ton of reading in books and on the web, I have noticed hints that WCF and MSMQ can be used to achieve high throughput. The information I have seen mentions using multiple WCF services in a farm that read from a single MSMQ queue. The problem is that I have found paragraphs here and there that mention that high throughput can be done, but I cannot seem to find a document on how to implement it. The following paragraph is from Best Practices for Queued Communication (http://msdn.microsoft.com/en-us/library/ms731093.aspx), an MSDN article:

        To achieve higher throughput and availability, use a farm of WCF services that read from the queue. This requires that all of these services expose the same contract on the same endpoint. The farm approach works best for applications that have high production rates of messages because it enables a number of services to all read from the same queue.

    This is what I'm trying to solve. I have an intranet application where a client sends a request to a WCF service, but I want the ability to load balance the WCF services on multiple servers in a farm. I also want these WCF services in the farm to do transactional reads from a remote MSMQ when an item is available in the queue. If this is possible, an issue I have is that I do not understand the activation process of WCF for retrieving messages from a remote queue. Does anyone know of any articles or webcasts that would explain it in detail? BarDev
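    For orientation, the pattern the quoted article describes comes down to every node in the farm hosting the same contract against the same remote queue address. A minimal config sketch (service, contract and host names are placeholders):

        <system.serviceModel>
          <services>
            <service name="OrderProcessing.OrderService">
              <endpoint address="net.msmq://queueserver/private/orders"
                        binding="netMsmqBinding"
                        contract="OrderProcessing.IOrderService" />
            </service>
          </services>
        </system.serviceModel>

    Each farm node runs this same service and listens on the shared queue; MSMQ's transactional receive then guarantees each message is consumed by exactly one node, with no broker in between. One constraint worth knowing: transactional reads from a remote queue require MSMQ 4.0 (Windows Server 2008) or later.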

    Read the article

  • CodePlex Daily Summary for Thursday, June 02, 2011

    CodePlex Daily Summary for Thursday, June 02, 2011

    Popular Releases
    * DotRas: DotRas v1.2 (Version 1.2.4168): This release includes compiled (and signed) versions of the binaries, PDBs, and CHM help documentation, along with both C# and VB.NET examples. Please don't forget to rate the release! If you find a bug, please open a work item and give as much description as possible. Stack traces, which operating system(s) you're targeting, and build type are also very helpful for bug hunting. If you find something you believe to be a bug but are not sure, create a new discussion on the discussions board. Thank...
    * Caliburn Micro: WPF, Silverlight and WP7 made easy: Caliburn.Micro v1.1 RTW: Download contents: debug and release assemblies, samples, Changes.txt, License.txt. Release highlights for WP7: a new tombstoning API based on ideas from Fluent NHibernate, a new Launcher/Chooser API, strongly typed navigation, SimpleContainer included; the full phone lifecycle is made easy to work with, i.e. we figure out whether your app is actually resurrecting or just continuing for you. For WPF: support for the Client Profile and better support for WinForms integration. All platforms: a power...
    * Pssdiag and Sqldiag Manager: Diag Manager Release 10.5.1.202: This release fixes a bug regarding saving the final package. If you browse to a folder when saving the configured package, the default file name changes back to pssd.exe; it should be pssd.cab.
    * VidCoder: 0.9.1: Added color coding to the Log window: errors are highlighted in red, HandBrake logs are in black, and VidCoder logs are in dark blue. Moved the enqueue button to the right with the other control buttons. Added logic to report failures when errors are logged during the encode or when the encode finishes prematurely. Added a Copy button to the Log window. Adjusted the audio track selection box to always show the full track name. Changed the encode job progress bar to also be colored yellow when the enco...
    * Terraria Map Generator: TerrariaMapTool 1.0.0.3 Beta: 1) Catch all exceptions from the GUI app. 2) Fixed the use of the graphics device to really use Reach.
    * AutoLoL: AutoLoL v2.0.1: Fixed a small bug in Auto Login; fixed the updater.
    * EPPlus - Create advanced Excel 2007 spreadsheets on the server: EPPlus 2.9.0.1: This version has been updated to .NET Framework 3.5. New features: data validation; PivotTables (basic functionality) with support for worksheet data sources, column, row, page and data fields, date and numeric grouping, and built-in styles; ...and more. And some minor new features: a Text property on ranges to get the formatted value, an AutofitColumns method to set the column width from the content of the range, a LoadFromCollection metho...
    * jQuery ASP.Net MVC Controls: Version 1.4.0.0: Version 1.4.0.0 contains the following additions: upgraded to MVC 3.0; upgraded to jQuery 1.6.1 (though the project supports all jQuery versions from 1.4.x onwards); upgraded to jqGrid 3.8; better Razor view-engine support; better pager support, including support for custom pagers; added jqGrid toolbar button support; search module refactored, with full support for multiple filters and ordering; and code cleanup, bug fixes, and better controller configuration support.
    * Restbucks on .Net: RestBucks Alpha 1: This is the first release of the application.
    * Nearforums - ASP.NET MVC forum engine: Nearforums v6.0: Version 6.0 of Nearforums, the ASP.NET MVC forum engine, containing new features: authentication using Membership Provider for SQL Server and MySQL; spam prevention (flood control); moderation (flag messages); content management (create pages such as about us/contact/texts through web administration); the ability to run Nearforums as an IIS sub-application; Facebook Connect migrated to OAuth 2.0. Visit the project roadmap for more details.
    * NetOffice - The easiest way to use Office in .NET: NetOffice Release 0.8b: Changes: fixed critical issue 15922 (AccessViolationException) once and for all; updating is strongly recommended. Includes: runtime binaries and source code for .NET Framework v2.0, v3.0, v3.5, and v4.0; tutorials in C# and VB.Net (COM proxy management, events, etc.); examples in C# and VB.Net (Excel, Word, Outlook, PowerPoint, Access); COM add-in examples in C# and VB....
    * Facebook Graph Toolkit: Facebook Graph Toolkit 1.5.4186: Updates the API in response to Facebook's recent change of policy: all Graph API calls accessing feeds or posts must provide an AccessToken.
    * SharePoint Farm Poster: SharePoint Farm Poster: SharePoint Farm Poster is generated by a PowerShell script. Run this script under the farm admin account. After downloading, unblock the file in the Properties window. The current version is beta: v0.3.7.
    * Serviio for Windows Home Server: Beta Release 0.5.2.0: Ready for widespread beta. Synchronized the build number to the Serviio version to avoid confusion.
    * AcDown - Anime & Comic Downloader: AcDown v3.0 Beta4 (2011-05-31): a downloader for Acfun video content; runs on 32- and 64-bit Windows XP/Vista/7 (Windows XP requires .NET Framework 2.0, x86 or x64). This release adds support for Bilibili.us and fixes several download issues for Acfun and 6.cn. Discussion...
    * ClosedXML - The easy way to OpenXML: ClosedXML 0.53.2: New in this release: 1) added FindCells, FindColumns, and FindRows methods to the workbook object; 2) added the cell.ValueCached property that holds Excel's cache of formula evaluations; 3) added the following properties to the worksheets: ShowFormulas, ShowGridLines, ShowOutlineSymbols, ShowRowColHeaders, ShowRuler, ShowWhiteSpace, ShowZeros; 4) added the methods Worksheets.Add(DataTable) and Worksheets.Add(DataSet) (see examples Adding DataTable as Worksheet, Adding DataSet); 5) fixed issues 6609,...
    * CodeCopy Auto Code Converter: Code Copy v0.1: Full add-in, setup project source code, and setup file.
    * Wordpress Theme 'Windows Metro': v1.00.0017: This is a preview version from the development phase.
    * Kooboo CMS: Kooboo CMS 3.02: Updated Kooboo_CMS.zip at 2011-06-02 11:44 GMT: 1. fixed an issue with adding data rules on pages; 2. fixed a caching issue for higher performance. Updated Kooboo_CMS.zip at 2011-06-01 10:00 GMT: 1. fixed a compile error thrown by the published package under WebSite mode; 2. fixed ContentHelper.NewMediaFolderObject returning TextFolder; 3. shortened the names of the ContentHelper API: NewMediaFolderObject => MediaFolder, NewTextFolderObject => TextFolder, NewSchemaObject => Schema. Also update the C...
    * mojoPortal: 2.3.6.6: See the release notes on mojoportal.com (http://www.mojoportal.com/mojoportal-2366-released). Note that we have separate deployment packages for .NET 3.5 and .NET 4.0. The deployment package downloads on this page are pre-compiled and ready for production deployment; they contain no C# source code. To download the source code, see the Source Code tab. I recommend getting the latest source code using TortoiseHg; you can get the source code corresponding to this release here.

    New Projects
    * Agile DDD Framework: Agile DDD Framework.
    * Enigma Suspicious Indicators Script: Enigma is a bash script that parses known suspicious-indicator data from open- and closed-source intelligence feeds and dynamically sends the data into ArcSight via CEF syslog. Please unzip the latest build and follow the instructions in the install.txt file.
    * Fonqi ColorPicker for Umbraco: A color picker datatype for the Umbraco CMS.
    * Fonqi.TemplatePicker Datatype for Umbraco: A template picker datatype for the Umbraco CMS, implemented with the Umbraco user-control wrapper.
    * Generic Library: Still being built...
    * Microsoft Surface StarterKit for VB.NET: Currently you can only create applications for Microsoft Surface in C#. Microsoft Surface StarterKit for VB.NET is an initial WPF project that implements all the references necessary to create a Microsoft Surface application in Visual Basic.
    * OpenCV Video Writer: For OpenCV 2.2. Main function: use the mouse to draw on a window; the edited video is produced in real time. Supports fade-out effects and translucent drawing. This program uses a read-edit-write model to achieve real-time editing of video.
    * Social Playlist: The Social Playlist module is a DNN module that enables musicians to upload and manage their music online (and possibly sell it) and enables other users (non-musicians) to create personal playlists. The dynamic between musicians and users is similar to sites like MySpace and iLike.
    * splitbook: splitbook.
    * SQL Compact Query Analyzer: SQL Server Compact Edition database query analyzer.
    * Term Store Migration Tool: Console application for migrating the term store from one SharePoint farm to another. The tool exports the term store in an XML format which can be restored to the same or another farm with the same GUIDs.
    * TFS fa schifo: This project is to be understood as a demonstration of the fact that TFS truly, inexorably, indiscriminately, and genuinely sucks.
    * Typheus Browser: A simple .NET browser based on the Typheus Browser from Homestuck.
    * Ultimate Explorer: A Windows Explorer alternative.
    * v76.DictionaryPicker for Umbraco: A dictionary picker datatype for the Umbraco CMS.

    Read the article

  • KDE4: Share Nepomuk data

    - by Frank Becker
    Hi all, is it possible to share Nepomuk data? I have several users here, and they should be able to read and modify the tags and comments for each publicly accessible file. It should also be possible to tag and comment on the files in each private home directory. Is this possible with the current KDE 4.3.2? Thanks for your help. Frank

    Read the article

  • Windows 2008 R2 Terminal Server - RemoteApp - Option "Start in" available?

    - by Frank
    We have set up a Windows 2008 R2 Terminal Server. On this server, we want to run an application that needs a "Start in" path different from the application's own directory. In a shortcut this is no problem (for example, the "Target" is C:\app\bin\application.exe and the "Start in" folder is C:\app\users\user1), but there is no "Start in" option when I configure the RemoteApp, so the application fails to start. Is there any way to set this? Thank you in advance. Frank
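
    One workaround sometimes used for this limitation (a suggestion, not something confirmed in this thread) is to publish a small wrapper executable as the RemoteApp instead of the application itself; the wrapper supplies the working directory that a shortcut's "Start in" field would normally set. A minimal C# sketch, reusing the example paths from the question:

        // Hypothetical wrapper: publish this executable as the RemoteApp
        // instead of application.exe.
        using System.Diagnostics;

        class Launcher
        {
            static void Main()
            {
                var psi = new ProcessStartInfo
                {
                    FileName = @"C:\app\bin\application.exe",
                    // Plays the role of the shortcut's "Start in" folder.
                    WorkingDirectory = @"C:\app\users\user1",
                    UseShellExecute = false
                };
                Process.Start(psi);
            }
        }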

    Read the article

< Previous Page | 1 2 3 4 5 6 7 8 9 10 11 12  | Next Page >