Search Results

Search found 4879 results on 196 pages for 'geeks'.


  • SharePoint 2010 Hosting :: How to Create an External Content Type in SharePoint 2010

    - by mbridge
    In this short article I want to show how easy it is to use SharePoint Designer 2010 to create an External Content Type over an external database and integrate it with our SharePoint portals. You can download SharePoint Designer 2010 here: http://www.microsoft.com/downloads/en/details.aspx?FamilyID=d88a1505-849b-4587-b854-a7054ee28d66&displaylang=en. For this example I will create a database in SQL Server and use SharePoint Designer 2010 to create the connection, so that a list in our SharePoint portal mirrors the database table. The first thing we need to do is connect to SQL Server and create our database, called "Contacts", with a table named "Contact" and the required fields. When we create the External Content Type, we need to associate it with a content type; in this case I am using the Generic List. Then we can create the connection to the external data source. After creating the connection to the database, we can define which columns we will use and which operations to add to our custom list. For this example I selected all the operations that come by default. These operations are very important because the business rules are defined in each one. After we create the different operations, we can create the custom list, define how each operation will behave, and give our custom list a name. If you try to access the new custom list, called "Custom Contact", you will see that we do not yet have access to Business Data Connectivity. To resolve this issue we need to grant users access and permissions to the custom External Content Type's BDC connection in Central Administration. Go to the Central Administration page and select "Service Application Tab > Manage Service Applications". There, select the "Business Data Connectivity Service" and then "Manage". This will list all External Content Types; choose the one we created and select "Set Object Permissions". This option allows you to add users to the BDC and manage their permissions on the custom list. After the correct permissions are granted, we can access the data in our custom Contact list and start creating new items, along with all the other options and operations we defined for the list. I hope you like this little article about connecting database content to a SharePoint portal using External Content Types and BCS. Thank you.
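    Once the BDC permissions are granted, the external list can also be consumed from the SharePoint server object model much like an ordinary list. Here is a minimal, hypothetical sketch (the site URL is a placeholder, and keep in mind that external list data is fetched through the BDC connection at query time, so the permissions above must already be in place):

        using System;
        using Microsoft.SharePoint;

        class ReadExternalContacts
        {
            static void Main()
            {
                // Placeholder site URL; this must run on a SharePoint server.
                using (SPSite site = new SPSite("http://portal"))
                using (SPWeb web = site.OpenWeb())
                {
                    // "Custom Contact" is the external list created above.
                    SPList contacts = web.Lists["Custom Contact"];

                    foreach (SPListItem item in contacts.Items)
                    {
                        // Each item is materialized from the SQL "Contact" table
                        // through the BDC connection, not from the content database.
                        Console.WriteLine(item.Title);
                    }
                }
            }
        }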

    Read the article

  • Free book from Microsoft on Office 365

    - by TATWORTH
    At http://blogs.msdn.com/b/microsoft_press/archive/2011/08/17/free-ebook-microsoft-office-365-connect-and-collaborate-virtually-anywhere-anytime.aspx you can get a free book from Microsoft on Office 365 (EPUB and MOBI formats are available at http://blogs.msdn.com/b/microsoft_press/archive/2011/09/23/free-ebook-microsoft-office-365-connect-and-collaborate-virtually-anywhere-anytime-now-in-more-formats.aspx).

    Read the article

  • From Pocket to Instapaper

    - by Michael Freidgeim
    Some time ago I described the issues I have had since a new version of Read It Later, named Pocket, was introduced. I waited hopefully for an upgrade, but the latest version (16 June 2012) was a huge disappointment. It didn't fix either of the two major problems I have experienced since the new Pocket was introduced: 1) the iPad app still didn't show many of the saved links, and 2) the ability to rename articles on the iPad still wasn't restored. I posted a message on their forum. They did not show my comment (I would call it censorship, not moderation), but a few days ago I received an email recommending that I "try logging out of the app on your iPad, and back in again." Their suggestion helped, but I don't understand why it is not posted as a recommendation on their support site. So I decided to try Instapaper on my iPad. Previously I had used it for Kindle, but I never considered it on the iPad, because there was no free demo and I was very satisfied with RIL Free and then RIL Pro. Currently Instapaper costs $3, so the price is not an issue. I checked that it has most of the features I use (e.g. renaming, folders) and I am quite happy with it now. I am actually using Pocket (or RIL Free) for old bookmarks (I have 1000+ stored on my iPad) and Instapaper for new ones. Having solid experience with RIL/Pocket, I created a list of suggestions for Marco Arment to implement:

    1. Some pages stored in Instapaper have essential sections of the text removed. E.g. in many blogs the comments are not stored, and some pages lose almost all of their important links (e.g. http://www.lib.rus.ec/a/32416 - sorry, in Russian). RIL/Pocket has two modes for storing pages offline: Web view and Article view. Web view includes all the links and images of the original page and is very reliable. Article view is supposed to strip unrelated information, but it often corrupts the content, so I prefer offline Web view. Instapaper should also support an offline Web view for cases where the stripped view removes important parts of the content.

    2. The black full-screen "Saving" overlay in iPad Safari is very annoying. After you press the bookmarklet, saving is delayed, and then for a few seconds the overlay prevents you from reading the text. It would be better to show a message at the top of the page (as Pocket does). I am surprised that the full-screen popup was recently implemented as a desired feature.

    3. There are no comments allowed on http://blog.instapaper.com/. I would prefer to post some of these notes as comments there rather than write them in my blog and then send the link to Marco. (I found a recommendation on how to add comment support on Tumblr at http://www.tumblr.com/help, but then realized that Marco was the lead developer of Tumblr.)

    4. There is also no support forum. I understand that maintaining a forum can be a hassle, but the Stack Exchange forums could be referred to on the http://www.instapaper.com/main/support page, i.e. http://webapps.stackexchange.com/search?q=Instapaper or http://apple.stackexchange.com/search?q=Instapaper.

    5. Tags are more convenient than folders, i.e. the same article can have more than one tag. Also, creating new folders is not supported offline, which is an annoying limitation.

    6. I would like a narrower list: in addition to the existing list modes, a subject-only or subject+site list would show more items on a screen.

    7. The limit of 500 offline articles sounds quite big, but my RIL list exceeded 1000, so it could be an issue in the future.

    8. The Search button in the iPad version is visible but doesn't work; it forces you to buy a Premium subscription. I don't think that is right. If a button in a paid app is visible and enabled, it should provide working functionality, e.g. search in article names only, leaving full-text search for the premium offering.

    9. Copy URL is an important operation and deserves to be at the first level of the Action menu, rather than in the Share sub-menu.

    I also have a comment on the post http://www.marco.org/2011/04/28/removed-instapaper-free, where Marco Arment explains why he doesn't provide a free version of Instapaper. I believe he is losing an essential part of his potential customers. When I decided which iPad application to choose, I selected RIL because I was able to play with the free version, and I liked it. I didn't have a chance to compare RIL and Instapaper on the iPad, so I bought RIL Pro. For a user there is no point in paying even $3 if there is a similar free product to try first and see whether it suits him or her. I also played with Readability. It doesn't have folders or tags (which are very important to me), but it supports full-text search nicely.

    Read the article

  • Stop trying to be perfect

    - by Kyle Burns
    Yes, Bob is my uncle too.  I also think the points in the Manifesto for Software Craftsmanship (manifesto.softwarecraftsmanship.org) are all great.  What amazes me is that we tend to confuse the term "well crafted" with "perfect".  I'm about to say something that will make Quality Assurance managers, and many development types as well, cringe until you think about it as a craftsman: "Stop trying to be perfect". Now let me explain what I mean.  Building software, as with building almost anything, often involves a series of trade-offs where either one undesired characteristic is accepted as necessary to achieve another desired one (or maybe to stave off one that is even less desirable), or a desirable characteristic is sacrificed for the same reasons.  This implies that perfection itself is unattainable.  What is attainable is "sufficient", and I think that this really goes to the heart of both what people are trying to do with Agile and with the craftsmanship movement.  Simply put, sufficient software drives the greatest business value.  I've been in many meetings where "how can we keep anything from ever going wrong" has become the thing that holds us in analysis paralysis.  I've also been the guy trying way too hard to perfect some function to make sure that every edge case is accounted for.  Somewhere in there, something a drill instructor said while I was in boot camp occurred to me.  In response to a question from another recruit about some edge case (I can barely remember the context), he said "What if grasshoppers had machine guns?  Would the birds still **** with them?"  It sounds funny, but there's a lot of wisdom in those words.  "Sufficient" is different for every situation, and it's important to understand what sufficient means in the context of the work you're doing.  If I'm writing a timesheet application (and please shoot me if I am), I'm going to have a much higher tolerance for imperfection than if you're writing software to control life support systems on spacecraft.  I'm also likely to have less need for high-volume performance than if you're writing software to control stock trading transactions.  I'd encourage anyone who has read this far, instead of trying to be perfect, to try to create software that is sufficient in every way.  If you're working to make a component that is already sufficient "better", ask yourself if there is any component left that is not yet sufficient.  If the answer is "yes", you're working on the wrong thing and need to adjust.  If the answer is "no", why aren't you shipping and delivering business value?

    Read the article

  • Benefits of Behavior Driven Development

    - by Aligned
    Originally posted on: http://geekswithblogs.net/Aligned/archive/2013/07/26/benefits-of-behavior-driven-development.aspx

    Continuing my previous article on BDD, I wanted to point out some benefits of BDD, and since BDD is an extension of Test Driven Development (TDD), you get those as well. I'll add another article on some possible downsides of this approach. There are many articles about the benefits of TDD, and they apply to BDD. I've pointed out some here and copied the main points from each article, but there are many more, including the book The Art of Unit Testing by Roy Osherove.

    http://geekswithblogs.net/leesblog/archive/2008/04/30/the-benefits-of-test-driven-development.aspx (Lee Brandt): Stability, Accountability, Design Ability, Separated Concerns, Progress Indicator.

    http://tddftw.com/benefits-of-tdd/: Help maintainers understand the intention behind the code. Bring validation and proper data handling concerns to the forefront. Writing the tests first is fun. Better APIs come from writing testable code. TDD will make you a better developer.

    http://www.slideshare.net/dhelper/benefit-from-unit-testing-in-the-real-world (from Typemock). Take a look at the slides, especially the extra time required for TDD (slide 10) and the bugs avoided using TDD (slide 11): fewer bugs (slide 11), about testing and development (slide 13), increased confidence in code (slide 14), fearlessly change your code (slide 14), document requirements (slide 14) (also see http://visualstudiomagazine.com/articles/2013/06/01/roc-rocks.aspx), discover usability issues early (slide 14).

    All these points and articles are great, and there are many more. The following are my additions to the benefits of BDD, from using it in real projects for my company:
    - July 2013 on MSDN - Behavior-Driven Design with SpecFlow
    - Scott Allen did a very informative TDD and MVC module, but to me he is doing BDD
    - Compile and Execute Requirements in Microsoft .NET ~ video from TechEd 2012

    Communication. I was working through a complicated task whose decision tree kept growing. After writing out the Given, When, Then of the scenario, I was able to tell QA what I had worked through for their initial test cases, and they were able to add to it from there. It is also useful to use this language with other developers, managers, or clients to help make informed decisions about whether something meets the requirements or can be simplified to save time (money).

    Thinking through solutions before starting to code. This was the biggest benefit to me. I like to jump into coding to figure out the problem. Many times I don't understand my path well enough and have to do some parts over. A past supervisor told me several times during reviews that I needed to get better at seeing "the forest for the trees". When I sit down and write out the behavior that I need to implement, I force myself to think things out further and catch scenarios before they get to QA. A co-worker who is new to BDD, which we've been using in our new project for the last 6 months, said "It really clarifies things". It took him a while to understand it all, but now he's seeing the value of this approach (yes, there are some downsides, but that is a different issue).

    Developers' confidence. This is huge for me. With tests in place, my confidence grows that I won't break code that I'm not directly changing. In the past, I've worked on projects without tests and we would frequently find regression bugs (or worse, the users would find them). That isn't fun. We don't catch all problems with the tests, but when QA catches one, I can write a test to make sure it doesn't happen again. It's also good for releasing code, telling your manager that it's good to go. As time goes on and the code gets older, how confident are you that checking in code won't break something somewhere else?

    Merging code - pre-release confidence. If you're merging code a lot, it's nice to have the tests to help ensure you didn't merge incorrectly.

    Interrupted work. I had a task that I started and planned out, then was interrupted for a month because of different priorities. When I started it up again and un-shelved my changes, I had the BDD specs, and they helped me remember what I had figured out and what was left to do. It would have been much more difficult without the specs and tests.

    Testing and verifying complicated scenarios. Sometimes in the UI there are scenarios that get tricky, because there are a lot of steps involved (click here to open the dialog, enter the information, make sure it's valid, when I click cancel it should do {x}, when I click ok it should close and do {y}, then do this, etc.). With BDD I can avoid some of the mouse clicking, define the scenarios, and have them re-run quickly, without using a mouse. UI testing is still needed, but this helps a bunch. The same can be true for tricky server logic.

    Documentation of assumptions and specifications. The BDD spec tests (Jasmine or SpecFlow or another tool) also work as documentation and show what the original developer was trying to accomplish. It's not a separate Word document, so developers will keep it up to date instead of letting it become obsolete. What happens if you leave the project (consulting, new job, etc.) with no specs, or at best a few good comments in the code? Sometimes I think of a new scenario, so I add a failing spec and continue in the same stream of thought (rather than forgetting it because it was on a piece of paper or in a notepad). Then later I can come back, handle it, and have it documented.

    Jasmine tests and JavaScript help deal with the untyped system. I like JavaScript, but I also dislike working with JavaScript. I miss C# telling me at build time if a property doesn't actually exist. I like the idea of TypeScript and hope to use it more in the future. I also use KnockoutJs, which has observables that need to be called with a trailing (), since an observable is a function. It's hard to remember when to use () or not, and the Jasmine specs/tests help ensure correct usage.

    This should give you an idea of the benefits that I see in using the BDD approach; I'm sure there are more. It takes a lot of practice, investment and experimentation to figure out how to approach this and to get comfortable with it. I agree with Scott Allen in the video I linked above: "Remember that TDD can take some practice. So if you're not doing test-driven design right now? You can start and practice and get better. And you'll reach a point where you'll never want to get back."
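    To make the Given/When/Then language concrete, here is a minimal SpecFlow binding sketch. This is my illustration, not code from the original post; the scenario and class names are invented, and it assumes the TechTalk.SpecFlow and NUnit packages:

        // Backs a hypothetical Gherkin scenario:
        //   Given I have entered 3 and 4
        //   When I add them
        //   Then the result should be 7
        using System.Collections.Generic;
        using System.Linq;
        using NUnit.Framework;
        using TechTalk.SpecFlow;

        [Binding]
        public class AdditionSteps
        {
            private readonly List<int> _operands = new List<int>();
            private int _result;

            [Given(@"I have entered (\d+) and (\d+)")]
            public void GivenIHaveEntered(int first, int second)
            {
                _operands.Add(first);
                _operands.Add(second);
            }

            [When(@"I add them")]
            public void WhenIAddThem()
            {
                _result = _operands.Sum();
            }

            [Then(@"the result should be (\d+)")]
            public void ThenTheResultShouldBe(int expected)
            {
                Assert.AreEqual(expected, _result);
            }
        }

    The scenario text lives in a plain .feature file that QA and managers can read and extend, which is exactly the communication benefit described above.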

    Read the article

  • Game-a-Week 2 (The Sequel)

    - by Matt Christian
    After finishing Game-a-Week One I immediately wanted to go back and begin refactoring the code, although I also wanted to work on a game demo idea I've had for quite a while.  I tried doing both over the holiday weekend while up north (without internet!) and eventually hit a wall with an error. Today I am going to restart my refactoring and updates by starting Game-a-Week 2.  This challenge is to do the following: refactor the old code, and add a handful of new features to the demo. This sounds simple enough but will be quite a challenge to finish in just around a week.  I have an idea of how I want to refactor the code, but the new features I'd like to implement will be tricky.  I'm going to try to implement: quest giving / finishing / NPCs, a Quest Log menu, inventory giving / receiving, and an Inventory menu. This Game-a-Week is much more design oriented, although it will provide a good challenge for programming as well.  Wish me luck!

    Read the article

  • Accounts in Work Items after migration to TFS 2010 and to new domain

    - by Clara Oscura
    Lately I've been doing some tests on migrating our TFS 2008 installation to TFS 2010, coupled with a machine and domain change. One topic that proved tricky is user accounts. We first installed a new machine with TFS 2010 and then migrated the projects from the old server. The work items were migrated with the projects. Great - but if I try to edit one of the old work items, I cannot save it anymore, because some fields contain old user names (e.g. OLDDOMAIN\user) which are not known in the new domain (they should be NEWDOMAIN\user). When I correct the 'Assigned To' field value, I get another error regarding another field. Before TFS 2010, we had the TFSUsers power tool, which allowed you to map an old user name to a new one. This is not available anymore, because work item fields with user accounts are now synchronized with AD display name changes (explained here). The correct way to go about this in TFS 2010 is to use TFSConfig Identities before adding the new domain accounts into the TFS groups (documented here). So, too late for us. I've found a (tedious) workaround to change those old accounts in work items so that people can keep working with them:

    1. Install the TFS 2010 power tools.

    2. Export the work item type (WIT) definition from your project (VS | Tools | Process Editor | Work Item Types). Save the definition, for example: Original_MyProject_Task.xml.

    3. Copy the xml (NoReadOnly_MyProject_Task.xml) and edit it. From the field definitions of 'Activated By', 'Closed By' and 'Resolved By', remove the following:

        <WHENNOTCHANGED field="System.State">
          <READONLY />
        </WHENNOTCHANGED>

    4. Import the WIT in VS. Choose the new file (NoReadOnly_MyProject_Task.xml) and import it into MyProject.

    5. Open all tasks in Excel (flat list). Display the following columns: Assigned To, Activated By, Closed By, Resolved By. Change the user accounts to the new ones (I usually sort each column alphabetically to make it easier).

    6. Publish. If you get a conflict on a field, tough luck: you will have to manually choose "Local version" for each work item. I told you it was a tedious process.

    7. Import the original WIT (Original_MyProject_Task.xml) into MyProject. We only changed the WI definition so that we could change some fields; the original definition should be put back.

    And what about the other fields, Created By and Authorized As? These fields are not editable by definition (VS | Tools | Process Editor | Work Item Fields Explorer), even if they are not marked as read-only in the WIT. You can leave the old values; it doesn't seem to matter to TFS. The other four fields are editable by definition, so only the WIT read-only rule prevents us from changing them. Technorati Tags: TFS,Team Foundation Server 2010,Work Item,Domain change
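    As an alternative to the Excel pass in step 5, the same bulk edit can be scripted against the TFS 2010 client object model. This is only a rough, hypothetical sketch (the collection URL, project name, and accounts are placeholders), and it still requires the relaxed work item type from steps 3-4 to be imported first:

        using System;
        using Microsoft.TeamFoundation.Client;
        using Microsoft.TeamFoundation.WorkItemTracking.Client;

        class FixWorkItemAccounts
        {
            static void Main()
            {
                // Placeholder collection URL.
                var tpc = new TfsTeamProjectCollection(
                    new Uri("http://newserver:8080/tfs/DefaultCollection"));
                var store = tpc.GetService<WorkItemStore>();

                // Find work items still carrying the old domain account.
                WorkItemCollection items = store.Query(
                    "SELECT [System.Id] FROM WorkItems " +
                    "WHERE [System.TeamProject] = 'MyProject' " +
                    "AND [System.AssignedTo] = 'OLDDOMAIN\\user'");

                foreach (WorkItem wi in items)
                {
                    wi.Open();
                    wi.Fields["Assigned To"].Value = @"NEWDOMAIN\user";
                    // Save can still fail while other fields hold unknown accounts.
                    wi.Save();
                }
            }
        }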

    Read the article

  • Yippy – the F# MVVM Pattern

    - by MarkPearl
    I did a recent post on implementing WPF with F#. Today I would like to expand on that posting to give a simple implementation of the MVVM pattern in F#. A good read about this topic can also be found on Dean Chalk's blog, although my example of the pattern is possibly simpler. With the MVVM pattern one typically has 3 segments: the view, the viewmodel and the model. With the beauty of WPF binding one is able to link the state-based viewmodel to the view. In my implementation I have kept the same principles. I have a view (MainView.xaml) and a ViewModel (MainViewModel.fs). What I would really like to illustrate in this posting is the binding between the View and the ViewModel, so I am going to jump to that… In Program.fs I have the following code…

        module Program

        open System
        open System.Windows
        open System.Windows.Controls
        open System.Windows.Markup
        open myViewModels

        // Create the View and bind it to the View Model
        let myView = Application.LoadComponent(new System.Uri("/FSharpWPF;component/MainView.xaml", System.UriKind.Relative)) :?> Window
        myView.DataContext <- new MainViewModel() :> obj

        // Application Entry point
        [<STAThread>]
        [<EntryPoint>]
        let main(_) = (new Application()).Run(myView)

    You can see that I have simply created the view (myView), created an instance of my viewmodel (MainViewModel), and then bound it to the data context with the code…

        myView.DataContext <- new MainViewModel() :> obj

    If I have a look at my viewmodel (MainViewModel), it looks like this…

        module myViewModels

        open System
        open System.Windows
        open System.Windows.Input
        open System.ComponentModel
        open ViewModelBase

        type MainViewModel() =

            // private variables
            let mutable _title = "Bound Data to Textbox"

            // public properties
            member x.Title with get() = _title and set(v) = _title <- v

            // public commands
            member x.MyCommand =
                new FuncCommand(
                    (fun d -> true),
                    (fun e -> x.ShowMessage))

            // public methods
            member public x.ShowMessage =
                let msg = MessageBox.Show(x.Title)
                ()

    I have exposed a few things: a mutable property called Title, a command, and a method called ShowMessage that simply pops up a message box when called. If I then look at my view, which I have created in XAML (MainView.xaml), it looks as follows…

        <Window xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
                xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
                Title="F# WPF MVVM" Height="350" Width="525">
          <Grid>
            <Grid.RowDefinitions>
              <RowDefinition Height="Auto"/>
              <RowDefinition Height="Auto"/>
              <RowDefinition Height="*"/>
            </Grid.RowDefinitions>
            <TextBox Text="{Binding Path=Title, Mode=TwoWay}" Grid.Row="0"/>
            <Button Command="{Binding MyCommand}" Grid.Row="1">
              <TextBlock Text="Click Me"/>
            </Button>
          </Grid>
        </Window>

    It is also very simple. It has a button whose command is bound to MyCommand, and a textbox whose text is bound to the Title property. One other module that I have created is my ViewModelBase. Right now it is used to store my commanding function, but I would look to expand on it at a later stage to implement other commonly used functions…

        module ViewModelBase

        open System
        open System.Windows
        open System.Windows.Input
        open System.ComponentModel

        type FuncCommand (canExec:(obj -> bool), doExec:(obj -> unit)) =
            let cecEvent = new DelegateEvent<EventHandler>()
            interface ICommand with
                [<CLIEvent>]
                member x.CanExecuteChanged = cecEvent.Publish
                member x.CanExecute arg = canExec(arg)
                member x.Execute arg = doExec(arg)

    Put this all together and you have a basic project that implements the MVVM pattern in F#. For me this is quite exciting, as it turned out to be a lot simpler to do than I originally thought possible. Also, because I have my view in XAML, I can use the XAML designer to design forms in F#, which I believe is a much cleaner way to go than implementing it all in code. Finally, if I look at my viewmodel code, it is actually quite clean and compact…

    Read the article

  • Creating Parent-Child Relationships in SSRS

    - by Tim Murphy
    As I have been working on SQL Server Reporting Services reports the last couple of weeks, I ran into a scenario where I needed to present a parent-child data layout.  It is rare that I have seen a report that was a simple tabular or matrix format, and this report continued that trend.  I found that the processes for developing complex SSRS reports aren't as commonly described as I would have thought.  Below I will lay out the process that I went through to create a solution. I started with a List control, which will contain the layout of the master (parent) information.  This allows for a main repeating report part.  The dataset for this report should include the data elements needed to be passed to the subreport as parameters.  As you can see, the layout is simply text boxes that are bound to the dataset. The next step is to set a row group on the List row.  When the dialog appears, select the field that you wish to group your report by.  A good example in this case would be the employee name or ID. Create a second report, which becomes the subreport.  The example below has a matrix control.  Create the report as you would any parameter-driven document by parameterizing the dataset. Add the subreport to the main report inside the row of the List control.  This can be accomplished by either dragging the report from the solution explorer or inserting a Subreport control and then setting the report name property. The last step is to set the parameters on the subreport.  In this case the subreport has EmpId and ReportYear as parameters.  Some of the documentation states that the dialog will automatically detect the child parameters, but this has not been my experience.  You must make sure that the names match exactly.  Tie the name of each parameter to either a field in the dataset or a parameter of the parent report. del.icio.us Tags: SQL Server Reporting Services,SSRS,SQL Server,Subreports

    Read the article

  • NEON Intrinsic Support in CE7

    - by Kate Moss' Open Space
    Just a side note for people who may be interested in creating high performance code that takes advantage of the NEON instruction set, but who wish to use NEON intrinsics instead of coding assembly. The compiler won't generate NEON opcodes unless the application uses the NEON intrinsics explicitly. Basically, you need an ARMv7 build environment so the compiler can emit NEON opcodes. The intrinsic prototypes can be found in public\COMMON\sdk\inc\arm_neon.h, and that is all you get. If you ever find a NEON opcode that does not have a corresponding intrinsic, you still need to use the old trick - write that part of the code in assembly.

    Read the article

  • Combining javascript files

    - by Michael Freidgeim
    I've read Combining Client Scripts into a Composite Script and wanted to use it. Then I read Julian Jelfs' concerns in ScriptManager.CompositeScript issues. However, the article Combining javascript files with Ajax toolkit library describes workarounds that make the solution workable. You can also use the Script Reference Profiler: http://aspnet.codeplex.com/releases/view/13356. Related posts: Using ScriptManager with other frameworks; MSDN documentation: CompositeScriptReference; the older implementation that has been superseded by the CompositeScript class: ToolkitScriptManager; Combining, Compressing, Minifying ASP.NET ScriptResource and HTML Markups.
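    For reference, here is a minimal code-behind sketch of what the composite script registration looks like (the page class and script paths are hypothetical; the same references can be declared in the CompositeScript element of the ScriptManager markup instead):

        using System;
        using System.Web.UI;

        public partial class DemoPage : Page
        {
            protected void Page_Load(object sender, EventArgs e)
            {
                // Assumes a ScriptManager is already declared on the page.
                ScriptManager sm = ScriptManager.GetCurrent(this.Page);

                // Hypothetical script paths; the listed files are served
                // back to the browser as a single combined request.
                sm.CompositeScript.Scripts.Add(
                    new ScriptReference("~/Scripts/validation.js"));
                sm.CompositeScript.Scripts.Add(
                    new ScriptReference("~/Scripts/helpers.js"));
            }
        }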

    Read the article

  • Silverlight: Creating great UIs

    - by xamlnotes
    I was always told I was left-brained and could not draw, and I bought into that view. Somewhere down the road, years ago, I did learn to play guitar - and to play by ear at that.  Now that's not all left-brained, so my right brain must be working.  About a year ago, my good friend Billy Hollis turned me on to a book by Betty Edwards (http://www.drawright.com/).  I started reading it, and soon I found myself drawing on napkins in restaurants while we were waiting on food, and at many other times too.  Dang'd if I could not draw! Check out my UI article at Dev Pro Connections (Great UIs article) on some of my experiences. Here are a few more links that are really cool too: a cool color combinations web site; Simply Painting is awesome (saw this guy on TV); and a site with some great tools for color contrasting.

    Read the article

  • A Bite With No Teeth – Demystifying Non-Compete Clauses

    - by D'Arcy Lussier
    *DISCLAIMER: I am not a lawyer and this post in no way should be considered legal advice. I’m also in Canada, so references made are to Canadian court cases. I received a signed letter the other day, a reminder from my previous employer about some clauses associated with my employment and entry into an employee stock purchase program. So since this is in effect for the next 12 months, I guess I’m not starting that new job tomorrow. I’m kidding of course. How outrageous, how presumptuous, pompous, and arrogant that a company – any company – would actually place these conditions upon an employee. And yet, this is not uncommon. Especially in the IT industry, we see time and again similar wording in our employment agreements. But…are these legal? Is there any teeth behind the threat of the bite? Luckily, the answer seems to be ‘No’. I want to highlight two cases that support this. The first is Lyons v. Multari. In a nutshell, Dentist hires younger Dentist to be an associate. In their short, handwritten agreement, a non-compete clause was written stating “Protective Covenant. 3 yrs. – 5mi” (meaning you can’t set up shop within 5 miles for 3 years). Well, the young dentist left and did start an oral surgery office within 5 miles and within 3 years. Off to court they go! The initial judge sided with the older dentist, but on appeal it was overturned. Feel free to read the transcript of the decision here, but let me highlight one portion from section [19]: The general rule in most common law jurisdictions is that non-competition clauses in employment contracts are void. The sections following [19] explain further, and discuss Elsley v. J.G. Collins Insurance Agency Ltd. and its impact on Canadian law in this regard. The second case is Winnipeg Livestock Sales Ltd. v. Plewman. Desmond Plewman is an auctioneer, and worked at Winnipeg Livestock Sales. Part of his employment agreement was that he could not work for a competitor for 18 months if he left the company. Well, he left, and took up an important role in a competing company. The case went to court and as with Lyons v. Multari, the initial judge found in favour of the plaintiffs. Also as in the first case, that was overturned on appeal. Again, read through the transcript of the decision, but consider section [28]: In other words, even though Plewman has a great deal of skill as an auctioneer, Winnipeg Livestock has no proprietary interest in his professional skill and experience, even if they were acquired during his time working for Winnipeg Livestock.  Thus, Winnipeg Livestock has the burden of establishing that it has a legitimate proprietary interest requiring protection.  On this key question there is little evidence before the Court.  The record discloses that part of Plewman’s job was to “mingle with the … crowd” and to telephone customers and prospective customers about future prospects for the sale of livestock.  It may seem reasonable to assume that Winnipeg Livestock has a legitimate proprietary interest in its customer connections; but there is no evidence to indicate that there is any significant degree of “customer loyalty” in the business, as opposed to customers making choices based on other considerations such as cost, availability and the like. So are there any incidents where a non-compete can actually be valid? Yes, and these are considered “exceptional” cases, meaning that the situation meets certain circumstances. 
    Michael Carabash has a great blog series discussing the above mentioned cases as well as the difference between a non-compete and non-solicit agreement. He talks about the exceptional criteria:

    In summary, the authorities reveal that the following circumstances will generally be relevant in determining whether a case is an "exceptional" one so that a general non-competition clause will be found to be reasonable:
    - The length of service with the employer.
    - The amount of personal service to clients.
    - Whether the employee dealt with clients exclusively, or on a sustained or recurring basis.
    - Whether the knowledge about the client which the employee gained was of a confidential nature, or involved an intimate knowledge of the client's particular needs, preferences or idiosyncrasies.
    - Whether the nature of the employee's work meant that the employee had influence over clients in the sense that the clients relied upon the employee's advice, or trusted the employee.
    - If competition by the employee has already occurred, whether there is evidence that clients have switched their custom to him, especially without direct solicitation.
    - The nature of the business with respect to whether personal knowledge of the clients' confidential matters is required.
    - The nature of the business with respect to the strength of customer loyalty, how clients are "won" and kept, and whether the clientele is a recurring one.
    - The community involved and whether there were clientele yet to be exploited by anyone.

    I close this blog post with a final quote, one from Zvulony & Co's blog post on this subject. Again, all of this is not official legal advice, but I think we can see what all these sources are pointing towards. To answer my earlier question, there's no teeth behind the threat of the bite.

    In light of this list, and the decisions in Lyons and Orlan, it is reasonably certain that in most employment situations a non-competition clause will be ineffective in protecting an employer from a departing employee who wishes to compete in the same business. The Courts have been relatively consistent in their position that if a non-solicitation clause can protect an employer's interests, then a non-competition clause is probably unreasonable. Employers (or their solicitors) should avoid the inclination to draft restrictive covenants in broad, catch-all language. Or in other words, when drafting a restrictive covenant – take only what you need!

    D

    Read the article

  • VBUG Spring Conference, 28th and 29th March in Reading

    - by Eric Nelson
    I presented at VBUG last year and can confirm that they put on a really good event. This year I stood aside for my "replacement" Steve Plank to work his magic. Worth checking out…

    VBUG SPRING CONFERENCE, 28/29 March 2011, Wokefield Park, Mortimer, Reading RG7 3AH

    Day One (Mon 28 March):
    - Developing SharePoint 2010 with Visual Studio 2010 – Dave McMahon
    - Cache Out with Windows Server AppFabric – Phil Pursglove
    - Extending your Corporate Network in to the Windows Azure Data Centre with Windows Azure Connect – Steve Plank
    - Silverlight Development on Windows Phone 7 – Andy Wigley

    Day Two (Tues 29 March):
    - Self Service BI for your users, but what does that mean for you? – Andrew Fryer
    - Design Patterns – Compare and Contrast – Gary Short
    - Projecting your corporate identity to the cloud – Steve Plank
    - May the Silverlight 4 be with you – Richard Costall
    - The Step up to ALM – an Introduction to Visual Studio 2010 TFS for the Visual Sourcesafe User – Richard Fennell

    For more information go to http://cms.vbug.net (it isn't free but it is high quality).

    Read the article

  • FREE Three Days of online SharePoint 2010 Developer Training March 14th to 16th

    - by Eric Nelson
    Over on my team blog I just posted another great opportunity. If you are UK based and work for a company that creates software products and want to dig into SharePoint 2010 development for FREE with a great UK based SME (subject matter expert) then register today. The training is 100% free and you don’t need to leave the comfort of your office/house/starbucks (other coffee shops with wifi do exist) – yet you still get to ask plenty of questions.

    Read the article

  • Ch-ch-ch-changes...

    - by Lou Vega
    The last few months have been pretty crazy. Just before the MVP summit in February I was approached about changing to a different project with my (then current) employer, and right after the summit I was approached by another company. Eventually I went with the new company and a new role in the Information Assurance field. More to come on that as things progress. All that being said, I've not been as active in the .NET community as I once was, and I miss it - so I'm looking to dive back in, especially as Windows Phone 7 draws nearer and nearer. Speaking of the community - many of you may not recognize me if you see me now :) I had told my son for the last couple of years that I would cut my hair before he turned 5 (he always asked how come he didn't have long hair), and since he turns 5 on June 19th (time has flown!), on May 30th I cut my long hair down pretty short and donated the hair to Locks of Love. As Chris said to me on Twitter, "pics or it didn't happen" - well, fortunately my wife was there to document the whole thing, so I'll get a picture or two posted here soon.

    Read the article

  • Function Folding in #PowerQuery

    - by Darren Gosbell
    Originally posted on: http://geekswithblogs.net/darrengosbell/archive/2014/05/16/function-folding-in-powerquery.aspx

    Looking at a typical Power Query query, you will notice that it's made up of a number of small steps. As an example, take a look at the query I did in my previous post about joining a fact table to a slowly changing dimension. It was roughly built up of the following steps:

    1. Get all records from the fact table.
    2. Get all records from the dimension table.
    3. Do an outer join between these two tables on the business key (resulting in an increase in the row count, as there are multiple records in the dimension table for each business key).
    4. Filter out the excess rows introduced in step 3.
    5. Remove extra columns that are not required in the final result set.

    If Power Query were to execute a query like this literally, following the same steps in the same order, it would not be overly efficient, particularly if your two source tables were quite large. However, Power Query has a feature called function folding where it can take a number of these small steps and push them down to the data source. The degree of function folding that can be performed depends on the data source. As you might expect, relational data sources like SQL Server, Oracle and Teradata support folding, but so do some of the other sources like OData, Exchange and Active Directory.

    To explore how this works, I took the data from my previous post and loaded it into a SQL database. Then I converted my Power Query expression to source its data from that database. Below is the resulting Power Query, which I edited by hand so that the whole thing can be shown in a single expression:

        let
            SqlSource = Sql.Database("localhost", "PowerQueryTest"),
            BU = SqlSource{[Schema="dbo",Item="BU"]}[Data],
            Fact = SqlSource{[Schema="dbo",Item="fact"]}[Data],
            Source = Table.NestedJoin(Fact,{"BU_Code"},BU,{"BU_Code"},"NewColumn"),
            LeftJoin = Table.ExpandTableColumn(Source, "NewColumn"
                                          , {"BU_Key", "StartDate", "EndDate"}
                                          , {"BU_Key", "StartDate", "EndDate"}),
            BetweenFilter = Table.SelectRows(LeftJoin, each (([Date] >= [StartDate]) and ([Date] <= [EndDate])) ),
            RemovedColumns = Table.RemoveColumns(BetweenFilter,{"StartDate", "EndDate"})
        in
            RemovedColumns

    If the above query were run step by step in a literal fashion, you would expect it to run two queries against the SQL database, doing "SELECT * …" from both tables. However, a profiler trace shows just the following single SQL query:

        select [_].[BU_Code],
            [_].[Date],
            [_].[Amount],
            [_].[BU_Key]
        from
        (
            select [$Outer].[BU_Code],
                [$Outer].[Date],
                [$Outer].[Amount],
                [$Inner].[BU_Key],
                [$Inner].[StartDate],
                [$Inner].[EndDate]
            from [dbo].[fact] as [$Outer]
            left outer join
            (
                select [_].[BU_Key] as [BU_Key],
                    [_].[BU_Code] as [BU_Code2],
                    [_].[BU_Name] as [BU_Name],
                    [_].[StartDate] as [StartDate],
                    [_].[EndDate] as [EndDate]
                from [dbo].[BU] as [_]
            ) as [$Inner] on ([$Outer].[BU_Code] = [$Inner].[BU_Code2] or [$Outer].[BU_Code] is null and [$Inner].[BU_Code2] is null)
        ) as [_]
        where [_].[Date] >= [_].[StartDate] and [_].[Date] <= [_].[EndDate]

    The resulting query is a little strange; you can probably tell that it was generated programmatically. But if you look closely, you'll notice that every single part of the Power Query formula has been pushed down to SQL Server. Power Query itself ends up just constructing the query and passing the results back to Excel; it does not do any of the data transformation steps itself. So now you can feel a bit more comfortable showing Power Query to your less technical colleagues, knowing that the tool will do its best to fold all the small steps in Power Query down to the most efficient query that it can against the source systems.

    Read the article

  • WSS 3.0 to SharePoint 2010: Tips for delaying the Visual Upgrade

    - by Kelly Jones
    My most recent project has been to migrate a bunch of sites from WSS 3.0 (SharePoint 2007) to SharePoint Server 2010.  The users are currently working with WSS 3.0 and Office 2003, so the new ribbon-based UI in 2010 will be completely new.  My client wants to avoid the new SharePoint 2010 look and feel until they've had time to train their users, so we've been testing the upgrades by keeping them on the 2007 user interface.

    Permission to perform the Visual Upgrade

    One of the first things we noticed was the default permissions for who is allowed to switch the UI from 2007 to 2010.  By default, site collection administrators and site owners can do this.  Since we wanted to more tightly control the timing of the new UI, I added a few lines to the PowerShell script that we are using to perform the migration.  This script creates the web application, sets the User Policy, and then does a Mount-SPDatabase to attach the old 2007 content database to the 2010 farm.  I added the following steps after the Mount-SPDatabase step:

        # Remove the visual upgrade option for site owners;
        # it remains for Site Collection administrators
        foreach ($sc in $WebApp.Sites){
            foreach ($web in $sc.AllWebs){
                # Visual Upgrade permissions for the site/subsite (web)
                $web.UIVersionConfigurationEnabled = $false;
                $web.Update();
            }
        }

    These script steps loop through each Site Collection in a particular web application ($WebApp), then loop through each subsite ($web) in the Site Collection ($sc) and disable the Site Owner's permission to perform the Visual Upgrade. This is equivalent to going to the Site Collection administrator settings page -> Visual Upgrade and selecting "Hide Visual Upgrade". Since only IT people have Site Collection administrator privileges, this allows IT to control the timing of the new 2010 UI rollout.

    Newly created subsites

    Our next issue was brought to our attention by SharePoint Joel's blog post last week (http://www.sharepointjoel.com/Lists/Posts/Post.aspx?ID=524). In it, he lists some updates about the 2010 upgrade, and his fourth point was one that I hadn't seen yet:

    4. If a 2007 upgraded site has not been visually upgraded, the sites created underneath it will look like 2010 sites – While this is something I've been aware of, I think many don't realize how this impacts common look and feel for master pages, and how it impacts good navigation and UI. As well depending on your patch level you may see hanging behavior in the list picker. The site and list creation Silverlight control in Internet Explorer is looking for resources that don't exist in the galleries in the 2007 site, and hence it continues to spin and spin and eventually time out. The work around is to upgrade to SP1, or use Chrome or Firefox which won't attempt to render the Silverlight control. When the root site collection is a 2007 site and has it's set of galleries and the children are 2010 sites there is some strange behavior linked to the way that the galleries work and pull from the parent.

    Our production SharePoint 2010 farm has SP1 installed, as well as the December 2011 Cumulative Update, so I think the "hanging behavior" he mentions won't affect us. However, since we want to control the timing of the UI rollout, we are concerned that new subsites will have the 2010 look and feel, no matter what the parent site has. OK, time to dust off my developer skills.

    I first looked into using feature stapling, but I couldn't get that to work (although I'm pretty sure I had everything wired up correctly).  Then I stumbled upon SharePoint 2010's web events - a great way to handle this. Using Visual Studio 2010, I created a new SharePoint project and added a Web Event Receiver. In the Event Receiver class, I used the WebProvisioned method to check whether the parent site is a 2007 site (UIVersion = 3), and if so, set the newly created site to 2007 as well:

        /// <summary>
        /// A site was provisioned.
        /// </summary>
        public override void WebProvisioned(SPWebEventProperties properties)
        {
            base.WebProvisioned(properties);

            try
            {
                SPWeb curweb = properties.Web;

                if (curweb.ParentWeb != null)
                {
                    // check if the parent website has the 2007 look and feel
                    if (curweb.ParentWeb.UIVersion == 3)
                    {
                        // since the parent site has the 2007 look and feel,
                        // apply that look and feel to the current web
                        curweb.UIVersion = 3;
                        curweb.Update();
                    }
                }
            }
            catch (Exception)
            {
                // TODO: Add logging for errors
            }
        }

    This event is part of a Feature that is scoped to the Site level (Site Collection).  I added a couple of lines to my migration PowerShell script to activate the Feature for any site collections that we migrate.

    Plan going forward

    The plan going forward is to perform the visual upgrade after the users for a particular site collection have gone through 2010 training. If we need to do several site collections at once, we'll use a PowerShell script to loop through each site collection and update the sites to 2010 (a sketch follows below).  If it's just one or two, we'll use the "Update All Sites" button on the Visual Upgrade page for Site Collection Administrators. The custom code for newly created sites won't need to be changed, since it relies on the UI version of the parent site.  If the parent is 2010, then the new site will look 2010.
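    A minimal sketch of that PowerShell loop might look like the following. This is my assumption, not a script from the original post; run it from the SharePoint 2010 Management Shell and adjust the web application URL:

        # Switch every site in a web application to the 2010 look and feel.
        $webApp = Get-SPWebApplication "http://portal"
        foreach ($sc in $webApp.Sites){
            foreach ($web in $sc.AllWebs){
                $web.UIVersion = 4     # 4 = SharePoint 2010 UI
                $web.UIVersionConfigurationEnabled = $false
                $web.Update()
                $web.Dispose()
            }
            $sc.Dispose()
        }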

    Read the article

  • Scrum for Team System v3 RTM available

    - by Enrique Lima
    If you are using Scrum for Team System (aka the Conchango Scrum template), it has hit RTM and is available for download.  First you will need to register, and then you will be able to access the goods. There is also a very well laid out Getting Started with v3 guide, put together by Crispin Parker. Note: it is very important to know that there is no upgrade or migration strategy laid out from v2.2, so you are left to your own devices on that one.  There are plenty of discussions going on about making it happen.  Don't get me wrong: v2.2 work items will be present and you will be able to use them, just don't expect the v3 template to trigger when adding new work items. For now, get TFS in place, install the template and start fresh.  The Workbench should be released soon too, and that makes it a great component of this solution.

    Read the article

  • SharePoint Saturday DC 2010 Slides, Demo Scripts, and Pictures

    - by Brian Jackett
    Wow! This past weekend I attended SharePoint Saturday Washington DC (SPSDC), which was quite an event to say the least.  For those unfamiliar, SharePoint Saturday is a community driven event where various speakers gather to present at a FREE conference on all topics related to SharePoint.  This was the fifth SharePoint Saturday I've attended and the fourth I've spoken at.  SPSDC was a bit different from most SharePoint Saturdays, mostly due to the scale of it.  We had almost 950 attendees, over 80 speakers presenting close to 90 sessions, and dozens of sponsors.  A big thanks goes out to the organizers of this event.  They put in a lot of hard work and time to pull this event off and should be very proud of the end result.

    For SPSDC I presented "The Power of PowerShell + SharePoint 2007".  I want to thank all of the attendees of my session for coming and asking some great questions.  Below you can find the slides and demo scripts for this session.  I also took some photos throughout the day (not as many as usual, since there was so much going on), so check them out.  If you have any follow up questions, feel free to drop me a line in the comments or the contact link at the top of the site.

    Slides and Scripts

    Click here for the demo scripts and slides posted on my SkyDrive.

    VERY IMPORTANT NOTE: One thing I forgot to mention in my presentation: in order to run code against the SharePoint API, you need to load the Microsoft.SharePoint.dll assembly first. Run the below command on the PowerShell console to do that:

        [void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")

    Photos

    Facebook album -or- my album on the Windows Live site (higher res shots). View Full Album

    -Frog Out

    Read the article

  • Sometimes you have to brag on your employer

    - by Mickey Gousset
    A lot of you know me as an Application Lifecycle Management MVP, and a huge proponent of ALM, TFS, and Visual Studio.  For my day job, however, I work for Infront Consulting Group, a System Center consulting and training organization.  I love what I do there, and work closely with Operations Manager, Service Manager, and Orchestrator.  And believe it or not, I use a lot of ALM best practices around all of those. Infront was just recognized as a 2012 Microsoft Corporate Account Virtualization Data Center Services Partner of the Year.  This award recognizes a solution partner that has demonstrated leadership and commitment in driving Microsoft virtualization solutions in the Microsoft Corporate Account segment.  I'm very proud of Infront, and all the hard work that everyone here has put into the incredible services we provide, which led to us winning this award.

    Read the article

  • WCF HttpClient Error calling a RESTful WCF Service - Cannot write more bytes to the buffer than the configured maximum buffer size: 65536

    - by Justin Hoffman
    Using the HttpClient API from wcf.codeplex.com, you may encounter this error if responses are too large: "Cannot write more bytes to the buffer than the configured maximum buffer size: 65536". In order to increase the size of the response buffer, just increase MaxResponseContentBufferSize as shown below. Set it to something larger than the default of 65536, depending on your response sizes.

        var client = new HttpClient
        {
            MaxResponseContentBufferSize = 196608,
            BaseAddress = new Uri("http://myservice/service1/")
        };

    Read the article

  • F# Simple Twitter Update

    - by mroberts
    A short while ago I posted some code for a C# Twitter update.  I decided to move the same functionality / logic to F#.  Here is what I came up with:

        namespace Server.Actions

        open System
        open System.IO
        open System.Net
        open System.Text

        type public TwitterUpdate() =

            //member variables
            [<DefaultValue>] val mutable _body : string
            [<DefaultValue>] val mutable _userName : string
            [<DefaultValue>] val mutable _password : string

            //Properties
            member this.Body with get() = this._body and set(value) = this._body <- value
            member this.UserName with get() = this._userName and set(value) = this._userName <- value
            member this.Password with get() = this._password and set(value) = this._password <- value

            //Methods
            member this.Execute() =
                let login = String.Format("{0}:{1}", this._userName, this._password)
                let creds = Convert.ToBase64String(Encoding.ASCII.GetBytes(login))
                let tweet = Encoding.ASCII.GetBytes(String.Format("status={0}", this._body))
                let request = WebRequest.Create("http://twitter.com/statuses/update.xml") :?> HttpWebRequest

                request.Method <- "POST"
                request.ServicePoint.Expect100Continue <- false
                request.Headers.Add("Authorization", String.Format("Basic {0}", creds))
                request.ContentType <- "application/x-www-form-urlencoded"
                request.ContentLength <- int64 tweet.Length

                let reqStream = request.GetRequestStream()
                reqStream.Write(tweet, 0, tweet.Length)
                reqStream.Close()

                let response = request.GetResponse() :?> HttpWebResponse

                match response.StatusCode with
                | HttpStatusCode.OK -> true
                | _ -> false

    While the above seems to work, it feels to me like it is not taking advantage of some functional concepts.  Love to get some feedback as to how to make the above more "functional" in nature.  For example, I don't like the mutable properties.

    Read the article

  • Windows Azure Platform TCO/ROI Analysis Tool

    - by kaleidoscope
    Microsoft has released a tool to help you figure out how much money you can save by switching to Windows Azure from your on-premises solution. The tool will provide you with a customized estimate of the potential cost savings you (or your company or organization) may achieve by building on the Windows Azure Platform. Upon completion of the TCO and ROI Calculator profile analysis, you will be presented with a detailed report showing estimated line item costs for an accurate TCO and a 1 to 3 year ROI analysis for you or your company or organization. You should not interpret the analysis report you receive as part of this process to be a commitment on the part of Microsoft, and Microsoft makes no guarantees regarding the accuracy of any information presented in the report. You should not view the results of this report as a substitute for engaging with a third party expert to independently evaluate your or your company's specific computing needs. The analysis report you will receive is for informational purposes only. For more information check this link. Geeta, G

    Read the article

  • Silverlight Goes Mobile!

    - by PeterTweed
    The most exciting announcements from Mix 2010 last week, for me, were the release of the Windows Phone 7 Series SDK and the news that the platform uses Silverlight as the application development technology. From the press and exposure that the platform is being given, and the experience that is promised, it looks like the Windows Phone 7 Series could eventually compete with the iPhone. For me this is exciting, as Silverlight can now be used to develop RIA apps, easily deployed desktop apps, and mobile apps. As someone who delivers enterprise technology solutions, this equates to a whole bunch of opportunity knocking at the door and asking to join the party. Watch this space for future posts on developing apps on the Windows Phone 7 Series platform!

    Read the article
