Search Results

Search found 6 results on 1 page for 'aoven'.

Page 1/1

  • When does a WPF adorner layer first become available?

    - by aoven
    I'm trying to add an overlay effect to my UserControl, and I know that's what adorners are used for in WPF. But I'm a bit confused about how they are supposed to work. I figured that the adorner layer is handled implicitly by the WPF runtime and, as such, should always be available. But when I create an instance of my UserControl in code, there is no adorner layer there. The following code fails with an exception:

        var view = new MyUserControl();
        var target = view.GetAdornerTarget(); // This returns a specific UI control.
        var layer = AdornerLayer.GetAdornerLayer(target);
        if (layer == null)
        {
            throw new Exception("No adorner layer at the moment.");
        }

    Can someone please explain to me how this is supposed to work? Do I need to place the UserControl instance into a top-level Window first? Or do I need to define the layer myself somehow? Digging through the documentation got me nowhere. Thank you!

    Read the article
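
    For context on the question above: AdornerLayer.GetAdornerLayer walks up the visual tree looking for the nearest adorner layer, which is supplied by an AdornerDecorator (a Window's default template contains one), so the layer only exists once the control sits in a built visual tree. Below is a minimal sketch of that idea; MyUserControl and GetAdornerTarget() are taken from the question, everything else is an assumption.

        using System.Windows;
        using System.Windows.Documents;

        var view = new MyUserControl();            // control from the question (assumed type)
        var host = new Window { Content = view };  // Window's template provides an AdornerDecorator

        host.Loaded += (sender, args) =>
        {
            var target = view.GetAdornerTarget();  // helper from the question (assumed)
            var layer = AdornerLayer.GetAdornerLayer(target);
            // layer should be non-null here, because the visual tree
            // (including the AdornerDecorator) exists by the time Loaded fires.
        };

        host.Show();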

  • Problem with DataTrigger binding - setters are not being called

    - by aoven
    I have a Command bound to a Button in XAML. When executed, the command changes a property value on the underlying DataContext. I would like the button's Content to reflect the new value of the property. This works*:

        <Button Command="{x:Static Member=local:MyCommands.TestCommand}"
                Content="{Binding Path=TestProperty, Mode=OneWay}" />

    But this doesn't:

        <Button Command="{x:Static Member=local:MyCommands.TestCommand}">
          <Button.Style>
            <Style TargetType="{x:Type Button}">
              <Style.Triggers>
                <DataTrigger Binding="{Binding Path=TestProperty, Mode=OneWay}" Value="True">
                  <DataTrigger.Setters>
                    <Setter Property="Content" Value="Yes"/>
                  </DataTrigger.Setters>
                </DataTrigger>
                <DataTrigger Binding="{Binding Path=TestProperty, Mode=OneWay}" Value="False">
                  <DataTrigger.Setters>
                    <Setter Property="Content" Value="No"/>
                  </DataTrigger.Setters>
                </DataTrigger>
              </Style.Triggers>
            </Style>
          </Button.Style>
        </Button>

    Why is that?

    * By "works" I mean the Content gets updated whenever I click the button. TIA

    Read the article
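
    One prerequisite for the scenario above is change notification: a DataTrigger only re-evaluates when the bound source raises PropertyChanged (or exposes a dependency property). A minimal sketch of a DataContext that could drive those triggers, built around the TestProperty named in the question; the class itself is hypothetical:

        using System.ComponentModel;

        public class TestViewModel : INotifyPropertyChanged
        {
            private bool _testProperty;

            public bool TestProperty
            {
                get { return _testProperty; }
                set
                {
                    if (_testProperty == value) return;
                    _testProperty = value;
                    // Without this notification the DataTriggers never re-evaluate.
                    PropertyChanged?.Invoke(this,
                        new PropertyChangedEventArgs(nameof(TestProperty)));
                }
            }

            public event PropertyChangedEventHandler PropertyChanged;
        }

    Note also that a locally set value has higher dependency-property precedence than a style trigger setter, so setting Content directly on the Button (as in the working example) while also styling it via triggers would mask the triggers.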

  • How to set Height of items in XAML so they always occupy the same proportion of available space in p

    - by aoven
    I have an ItemsControl with the following ItemTemplate:

        <DataTemplate x:Key="myItemTemplate">
          <TextBlock Height="???" Text="{Binding Path=Description}" />
        </DataTemplate>

    My question is: how do I set the Height of the TextBlock in the template so that each item automatically takes up ItemsControl.Height / ItemsCount of vertical space? When there's only one item, I'd like it to be the full height of the container; when there are two, each should be half the size, and so on. If possible, I'd prefer to do this completely in XAML to keep my ViewModel clean of UI logic.

    Read the article
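
    A sketch of one purely XAML approach: swapping the default items panel for a single-column UniformGrid makes the panel divide its height equally among however many items it holds, so the TextBlock in the template needs no explicit Height at all. The Items binding below is a placeholder; myItemTemplate is the template from the question.

        <ItemsControl ItemsSource="{Binding Path=Items}"
                      ItemTemplate="{StaticResource myItemTemplate}">
          <ItemsControl.ItemsPanel>
            <ItemsPanelTemplate>
              <UniformGrid Columns="1"/>
            </ItemsPanelTemplate>
          </ItemsControl.ItemsPanel>
        </ItemsControl>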

  • Is it possible to have CommandManager requery only specific WPF command instead of all?

    - by aoven
    I'm trying to implement a highly responsive UI for my MVVM application, so I've chosen to have all command handlers automatically execute in a BackgroundWorker so they don't block the UI. But at the same time, I don't want the user to be able to execute the same command while it is still executing in the background. The solution seems obvious:

      1. When the Executed handler is invoked, have the CanExecute handler return false.
      2. Start the BackgroundWorker asynchronously.
      3. When the BackgroundWorker finishes, have the CanExecute handler return true again.

    Problem is, I need to notify WPF after step 1 and again after step 3 that CanExecute has changed and that it should be requeried. I know I can do it by calling CommandManager.InvalidateRequerySuggested, but that causes the CanExecute handlers of all other commands to be requeried as well. With a lot of commands, that's not good. Is there a way to ask for a requery of a specific command - i.e. the one that is currently being executed? TIA

    Read the article
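
    The usual way around CommandManager.InvalidateRequerySuggested is to give the command its own CanExecuteChanged event and raise it explicitly, so only bindings to that one command requery. A minimal sketch of such a command (the class and member names are assumptions, not from the original post):

        using System;
        using System.Windows.Input;

        public class RequeryableCommand : ICommand
        {
            private readonly Action _execute;
            private readonly Func<bool> _canExecute;

            public RequeryableCommand(Action execute, Func<bool> canExecute = null)
            {
                _execute = execute;
                _canExecute = canExecute;
            }

            // Backed by our own event instead of CommandManager.RequerySuggested,
            // so raising it touches only the bindings to this command.
            public event EventHandler CanExecuteChanged;

            public bool CanExecute(object parameter) =>
                _canExecute == null || _canExecute();

            public void Execute(object parameter) => _execute();

            // Call this after step 1 (disable) and step 3 (re-enable) from the question.
            public void RaiseCanExecuteChanged() =>
                CanExecuteChanged?.Invoke(this, EventArgs.Empty);
        }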

  • Clarification needed: How does .NET runtime resolve assembly references from parent folder?

    - by aoven
    I have the following output structure of executables in my solution:

        %ProgramFiles%
          |
          +-[MyAppName]
              |
              +-[Client]
              |   |
              |   +-(EXE & several DLL assemblies)
              |
              +-[Common]
              |   |
              |   +-[Schema Assemblies]
              |   |   |
              |   |   +-(several DLL assemblies)
              |   |
              |   +-(several DLL assemblies)
              |
              +-[Server]
                  |
                  +-(EXE & several DLL assemblies)

    Each project in the solution references different DLL assemblies, some of which are outputs from other projects in the solution, and others are plain 3rd-party assemblies. For example, the [Client] EXE might reference an assembly in [Common], which is in a different directory branch. All references have "Copy Local" set to false, to mirror the layout of the files in the final installed application. Now, if I take a look at reference properties in the Visual Studio IDE, I see that the "Path" of every reference is absolute and that it corresponds to the actual output location of the assembly. That's understandable and correct. As expected, the solution compiles and runs just fine. What I don't understand is why everything seems to work even when I close the IDE, rename the [MyAppName] directory and run the [Client] EXE manually. How does the runtime find the assemblies if the reference paths aren't the same as they were at the time of linking? To be clear - this is actually exactly what I'm after: a semi-dispersed set of application files that run fine regardless of where the [MyAppName] directory is located or even what it's named. I'd just like to know how and why this works without any specific path resolution on my part. I've read the answers to this similar question, but I still don't get it. Help much appreciated!

    Read the article
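
    Two facts help frame the question above: the absolute "Path" shown in the IDE is used only at compile time, and what gets recorded in the output is the referenced assembly's identity (name, version, culture, public key token), which the runtime then probes for starting at the application base directory and its configured subdirectories. Default probing does not reach parent or sibling folders, so one explicit way to resolve assemblies from the [Common] folder is an AssemblyResolve handler, sketched below. This is only an illustration of the mechanism, not necessarily what makes the author's particular layout work; everything except the folder names is assumed.

        using System;
        using System.IO;
        using System.Reflection;

        static class CommonAssemblyResolver
        {
            public static void Register()
            {
                AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
                {
                    // Look for the requested assembly in the sibling [Common] folder.
                    string commonDir = Path.GetFullPath(Path.Combine(
                        AppDomain.CurrentDomain.BaseDirectory, @"..\Common"));
                    string candidate = Path.Combine(
                        commonDir, new AssemblyName(args.Name).Name + ".dll");

                    return File.Exists(candidate) ? Assembly.LoadFrom(candidate) : null;
                };
            }
        }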

  • Systems design question: DB connection management in load-balanced n-tier

    - by aoven
    I'm wondering about the best approach to designing a DB connection manager for a load-balanced n-tier system. Classic n-tier looks like this:

        Client -> BusinessServer -> DBServer

    A load-balancing solution as I see it would then look like this:

                            +--> ...            +--+
                            +--> BusinessServer +--+--> SessionServer --+
        Client -> Gateway --+--> BusinessServer +--|                    +--> DBServer
                            +--> BusinessServer +--+--------------------+
                            +--> ...            +--+

    As pictured, the business server component is being load-balanced via multiple instances, and a hardware gateway is distributing the load among them. The session server probably needs to sit outside the load-balancing array, because it manages state, which mustn't be duplicated. Barring any major errors in the design so far, what is the best way to implement DB connection management? I've come up with a couple of options, but there may be others I'm not aware of:

      1. Introduce a new Broker component between the DBServer and the other components and let it handle the DB connections. The upside is that all the connections can be managed from a single point, which is very convenient. The downside is that there is now an additional "single point of failure" in the system, and every request that involves the DB in some way must go through it, which also makes it a bottleneck.
      2. Move the DB connection management into the BusinessServer and SessionServer components and let each handle its own DB connections. The upside is that there is no additional "single point of failure" or bottleneck component. The downside is that there is also no control over possible conflicts and deadlocks apart from what the DBServer itself can provide.

    What else can be done? FWIW: The technology is .NET, but none of the vendor-specific stacks are used (e.g. no WCF, MSMQ or the like).

    Read the article
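
    One data point relevant to weighing the two options above: ADO.NET already pools physical connections per process and per connection string, so option 2 often amounts to each BusinessServer instance leaning on that built-in pool with an "open late, close early" pattern. A minimal sketch of that pattern follows; the repository name, connection string handling and query are placeholders, not from the original post.

        using System.Data.SqlClient;

        public class CustomerRepository
        {
            private readonly string _connectionString;

            public CustomerRepository(string connectionString)
            {
                _connectionString = connectionString;
            }

            public int CountCustomers()
            {
                // Dispose returns the physical connection to the per-process pool,
                // so concurrent requests share a bounded set of DB connections.
                using (var connection = new SqlConnection(_connectionString))
                using (var command = new SqlCommand("SELECT COUNT(*) FROM Customers", connection))
                {
                    connection.Open();
                    return (int)command.ExecuteScalar();
                }
            }
        }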
