Search Results

Search found 2334 results on 94 pages for 'unity'.

Page 19/94 | < Previous Page | 15 16 17 18 19 20 21 22 23 24 25 26  | Next Page >

  • Installed nvidia driver, activated it, and now Unity is gone. No bars, menus, nothing

    - by Noel
    I installed the nvidia driver (the ubuntu-x-swat ones), updated it, pulled in the updates for it, and installed bumblebee. I restarted after every one of those steps, so no, I don't simply need to 'restart X'. I tried to run things through bumblebee, but bumblebee complained that it couldn't access the GPU driver. So I ran nvidia-settings, which said the drivers weren't in use, then ran "sudo nvidia-xconfig" and restarted. Now my login screen looks different than it did before: it asks whether I want to load "GNOME, GNOME - no effects, Cairo Dock - GNOME, System Default, or Ubuntu" when I log in. But worst of all, I no longer have any kind of GNOME/Unity GUI. There are no title bars above any windows and no close/minimize/maximize buttons. The Unity bar is gone and will not show up when I summon it, and the top status bar is also missing.

    Read the article

  • How can I edit/create new launcher items in Unity by hand?

    - by Ike
    Will Unity allow making custom launcher icons from .desktop files or via a menu editing system? Right now the launcher doesn't offer "Keep in launcher" for all programs. For some programs I use, I have to make custom launchers or .desktop files; for instance, daily Blender builds are generally just folders with an executable. In basic GNOME or KDE I can make a new menu entry with the menu editing system, and I can also add it to Docky either from the menu or by dragging a .desktop file onto it. The Unity launcher doesn't support drag and drop, so that's not a bug or anything, but when I open a .desktop file the results are unpredictable: most of the time it will not offer "Keep in launcher"; sometimes it shows a pinnable item without the .desktop's icon, and if I pin that item to the launcher it will not launch the program again after I close it. I've also had it just work with a .desktop file, for "celtx".
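
    For reference, a minimal hand-written .desktop file for something like a daily Blender build might look like the sketch below (the paths and names are made up for illustration); dropping it into ~/.local/share/applications/ should make the entry show up in the Dash so it can be launched and pinned from there:

        [Desktop Entry]
        Type=Application
        Name=Blender (daily build)
        Comment=Hypothetical launcher entry for a daily Blender build
        Exec=/home/me/apps/blender-daily/blender
        Icon=/home/me/apps/blender-daily/icons/blender.svg
        Terminal=false
        Categories=Graphics;3DGraphics;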

    Read the article

  • Will Unity allow users to change the color/appearance of the top-Panel?

    - by Sam
    I'm very excited by the functional design principles and keyboard shortcuts being implemented for Unity, and function is more important to me than looks. However, after experiencing the aesthetic beauty of the top panel in gnome-shell, I was wondering whether users will be able to alter the color of Unity's top panel. IMHO it does not look as good as the gnome-shell implementation (or Mac OS X/the iPad). I think an alternate color/appearance for the panel would make a big difference aesthetically. Is there a way to make it black like gnome-shell, or are the color choices limited to theme designs, as pointed out in this answer? For efficiency and clarity, the panel should be better differentiated from application controls: it should be a different color because it has a "constant (always present) state", unlike application controls. For contrast and easy recognition, I would like to make the panel black (like gnome-shell) but make the application controls (e.g., those of Firefox) "inverted".

    Read the article

  • Unity-webapps: what to do with the downloaded file?

    - by user104293
    I have installed the unity-webapps package on Ubuntu 12.04 (sudo apt-get install unity-webapps). Now I want to see what the fuss is about, so I go to http://bazaar.launchpad.net/~webapps/webapps-applications/trunk/files/head:/src/, click on one of the options (e.g. Google Calendar), and download one of the files (GoogleCalendar.user.js). What do I do with this file? Is webapps actually working for me? This is not what I was expecting to happen.

    Read the article

  • How are you coping with Ubuntu's Unity app launcher? (It auto-hides, can't minimize apps)

    - by Bad Learner
    [Firstly, let me say that this cannot be subjective in any way, as I think at least Ubuntu beginners will have these questions in mind; and yes, this is a question that has a definite answer, so I am completely within the rules.] Coming to the point: I see that Ubuntu has used Unity since v10.xx (the netbook edition?) and carried it over to v11.04 and v11.10. As someone who has stuck to Windows all these years, I find it somewhat difficult to cope with Ubuntu's Unity, for the following reasons: [1] The Unity app launcher (at the screen's left) auto-hides when a window is maximized. [2] Once launched, apps cannot be minimized by clicking the app's icon in the launcher; I have to go to the top-left of the screen and click the "_" button. I do know I can fix these issues by installing some configuration tool, but the thing is, if that's how it's meant to work, Canonical/Ubuntu would have designed it that way. They didn't. Why? Regarding the points above: [1] EDITED: Does this mean it's better to work without maximizing windows? Because if I maximize a window, the app launcher hides, and I have to hover the mouse at the left edge of the screen, wait a bit (even if it's a second or less, I can still feel the lag), and then click the next app icon in the launcher to switch to it. I know I can use Alt+Tab to switch, but I am never sure which window comes next, which doesn't feel productive. It also makes me feel Ubuntu is designed for large screens (it's nice on my 1920x1080 screen), because on a large screen I can have two windows side by side; that is not possible on smaller screens. [2] Being able to minimize an application's window by clicking its icon in the launcher (just like on Windows and probably elsewhere) would have been great, rather than having to go to the top-left and click the _ (minimize) button, which itself brings the app launcher out of hiding most of the time. It's tiring to have these small issues in the UI. I really would like to know how you are coping with these issues as they are.

    Read the article

  • Starting multiple applications in Ubuntu Unity

    - by Black
    I would like to start multiple GUI applications with a single script or command in Ubuntu 12. So far I have a shell script that starts an application in the foreground, waits for its termination, and afterwards starts several applications (browser, mailer, IRC client) in the background. The script works, however all the applications get the same icon and are treated like different windows of one application, i.e. the script. Is there a way to start applications from a script so that Unity displays each application's own icon, e.g. the Thunderbird icon, instead of a single default icon for the script? The script looks like this:

        #! /bin/bash
        # wait for termination...
        /usr/bin/libreoffice path/to/document
        # in background
        /usr/bin/thunderbird &
        /usr/bin/pidgin &

    Read the article

  • How to get the Dash and HUD to appear (and stop Unity spewing error messages)

    - by Ubuntiac
    I just installed Ubuntu 12.04 on my wife's Dell Inspiron 1501, which uses an R300 ATI graphics chip. Neither the Dash nor the HUD appears when I press the appropriate key. When I run unity --reset & in a terminal, I see it spitting out, over and over:

        r300: CS space validation failed. (not enough memory?) Skipping rendering.

    This is just after starting Ubuntu with no apps open, so I find it hard to believe that merely rendering the Dash/HUD completely blows out the VRAM. Any suggestions on getting this working? /usr/lib/nux/unity_support_test -p shows:

        OpenGL vendor string:   X.Org R300 Project
        OpenGL renderer string: Gallium 0.4 on ATI RS480
        OpenGL version string:  2.1 Mesa 8.0.2

        Not software rendered:    yes
        Not blacklisted:          yes
        GLX fbconfig:             yes
        GLX texture from pixmap:  yes
        GL npot or rect textures: yes
        GL vertex program:        yes
        GL fragment program:      yes
        GL vertex buffer object:  yes
        GL framebuffer object:    yes
        GL version is 1.4+:       yes
        Unity 3D supported:       yes

    All sections say "yes".

    Read the article

  • Is there any performance difference between Ubuntu Unity and Classic/Fallback?

    - by user48949
    Is there any difference between using Ubuntu Unity and Ubuntu Classic/Fallback? Just to be clear, I'm not talking about the Launcher or the Dash; of course Classic/Fallback doesn't have them, but that's not the difference I mean. I mean differences in performance, features, functionality, compatibility, and so on. I'm asking because I've heard that Fallback Mode is somewhat "incomplete" compared to GNOME Shell or Ubuntu Unity, and I'd like to know whether that's true, because if it is, I don't think using Fallback Mode is worth it.

    Read the article

  • I just updated to 12.04 from 11.10 and now the Unity bar is very sluggish and even the menus do not show on the panel

    - by jredkai
    Like the title says, I just upgraded to 12.04. I know it's only a beta, but I'm having problems with the Unity bar and the panel menus. First, the Unity bar is acting very sluggish; I really don't know how to explain it. Also, the menu at the top keeps a strange little black bar on everything, even across my tabs in Firefox, and it doesn't let me see any of the menus: they flicker and you sort of have to guess where everything is. I'm at a loss. I've already run updates a couple of times to see if I missed something. I really hope I don't have to completely redo my whole Ubuntu setup. Any help would be greatly appreciated.

    Read the article

  • Why do I have Unity and GNOME Classic running at the same time?

    - by jerowin
    I've recently updated and found that Unity shows up on the gnome-classic/fallback desktop. Is this a new feature (which I doubt), or a bug? If it is a bug, how do I disable Unity? It interferes with the lower panel (i.e. it brings out the Trash instead of showing the desktop when I press the "Show desktop" button while the panel is not in focus). I don't encounter this problem in gnome-classic (No effects). Don't judge me, I like Pokémon!

    Read the article

  • Why does Compiz or Unity refresh the screen with every action I take? [closed]

    - by Behzadsh
    It's driving me crazy! Compiz or Unity refreshes the screen (as if I had run compiz --replace or unity --replace) with every action I take (e.g. Ctrl+Tab, Super+W), and somewhat unexpectedly! Sometimes it fails to reload the title bars, keyboard functions like Ctrl+Tab and Alt+F2 stop working, and I have no choice but to reboot. Sometimes it works without any problem. I couldn't find any reason why this happens. I wanted to report a bug, but I don't have enough information about it.

    Read the article

  • Unity 3D in 11.10 with VirtualBox in OS X?

    - by Roshambo
    I'm having a horrible time with Unity 3D in Ubuntu 11.10 running in a VirtualBox VM on OS X. Such a hard time, in fact, that I'm about to give up and conclude that it simply isn't possible to use Unity 3D in a configuration like this. The problem is that windows simply do not render. I've found that killing Nautilus makes the problem go away, but that's not much of a solution. I have installed the guest additions, am running the VM with 2048 MB RAM and 128 MB video memory, and have enabled 3D acceleration. I've tried all this on several Macs, with no luck. Unity 2D, on the other hand, works fine across the board. Any advice or experience would be greatly appreciated.

    Read the article

  • Unity: "The parameter host could not be resolved when attempting to call constructor"

    - by Terrance
    When I attempt to instantiate an instance of the base class, I get a ResolutionFailedException with roughly the following message: "The parameter host could not be resolved when attempting to call constructor". I'm currently not using an interface for the base type, and my concrete class inherits from the base class. I'm new to Unity and DI, so I'm guessing it's something I forgot.

        ExeConfigurationFileMap map = new ExeConfigurationFileMap();
        map.ExeConfigFilename = "Unity.Config";
        Configuration config = ConfigurationManager.OpenMappedExeConfiguration(map, ConfigurationUserLevel.None);
        UnityConfigurationSection section = (UnityConfigurationSection)config.GetSection("unity");
        IUnityContainer container = new UnityContainer();
        section.Containers.Default.Configure(container);
        // Throws the exception here
        BaseCalculatorServer server = container.Resolve<BaseCalculatorServer>();

    And the Unity.Config file:

        <container>
          <types>
            <type name="CalculatorServer" type="Calculator.Logic.BaseCalculatorServer, Calculator.Logic" mapTo="Calculator.Logic.CalculateApi, Calculator.Logic"/>
          </types>
        </container>
        </containers>

    The base class:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using System.Runtime.Serialization;
        using System.ServiceModel;
        using System.ServiceModel.Transactions;
        using Microsoft.Practices.Unity;
        using Calculator.Logic;

        namespace Calculator.Logic
        {
            public class BaseCalculatorServer : IDisposable
            {
                public BaseCalculatorServer() { }

                public CalculateDelegate Calculate { get; set; }
                public CalculationHistoryDelegate CalculationHistory { get; set; }

                /// <summary>
                /// Performs application-defined tasks associated with freeing, releasing, or resetting unmanaged resources.
                /// </summary>
                public void Dispose()
                {
                    this.Dispose();
                }
            }
        }

    The implementation:

        using System;
        using System.Collections.Generic;
        using System.Linq;
        using System.Text;
        using Calculator.Logic;
        using System.ServiceModel;
        using System.ServiceModel.Configuration;
        using Microsoft.Practices.Unity;

        namespace Calculator.Logic
        {
            public class CalculateApi : BaseCalculatorServer
            {
                public CalculateApi(ServiceHost host)
                {
                    host.Open();
                    Console.WriteLine("Press Enter To Exit");
                    Console.ReadLine();
                    host.Close();
                }

                public CalculateDelegate Calculate { get; set; }
                public CalculationHistoryDelegate CalculationHistory { get; set; }
            }
        }

    Yes, both the base class and the implementation are in the same namespace, and that's something that will change, design-wise, once I get this working. And here is the more detailed error:

        Resolution of the dependency failed, type = "Calculator.Logic.BaseCalculatorServer", name = "".
        Exception message is: The current build operation (build key Build Key[Calculator.Logic.BaseCalculatorServer, null]) failed:
        The value for the property "Calculate" could not be resolved. (Strategy type BuildPlanStrategy, index 3)

    Read the article

  • How do I remap the keyboard shortcut for Gnome Do?

    - by johnc
    I am giving Unity a chance to win me over in Natty, but I admit I am a heavy Gnome Do user, and Unity has remapped the Super+Space keyboard shortcut to show the Unity launcher. I am not yet sold on the new Unity launcher and would like to keep using Gnome Do, at least until I am convinced that the Unity launcher is as frictionless as Gnome Do. Is it possible to remap Super+Space back to Gnome Do?

    Read the article

  • After upgrading to 13.10, no applications are visible in the application lens

    - by moeso
    I upgraded from 13.04 to 13.10 and now all icons in the application lens are gone. This is what I have tried so far:

        - apt-get install --reinstall unity-lens-applications
        - unity --replace and unity --reset-icons
        - moving ~/.config to ~/.config2
        - deleting ~/.cache/software-center and ~/.cache/unity

    Most of these things were suggested in this question: Unity Applications lens is empty - but all to no avail.

    Read the article

  • Why doesn't Unity's OnCollisionEnter give me surface normals, and what's the most reliable way to get them?

    - by michael.bartnett
    Unity's on-collision event gives you a Collision object with some information about the collision that happened (including a list of ContactPoints with hit normals), but what you don't get is surface normals for the collider that you hit. Here's a screenshot to illustrate: the red line is from ContactPoint.normal and the blue line is from RaycastHit.normal. Is this an instance of Unity hiding information to provide a simplified API, or do standard real-time 3D collision detection techniques simply not collect this information? And for the second part of the question, what's a surefire and relatively efficient way to get a surface normal for a collision? I know that raycasting gives you surface normals, but it seems I need to do several raycasts to cover all scenarios (maybe a contact point/normal combination misses the collider on the first cast, or maybe you need to average all the contact points' normals to get the best result). My current method:

        1. Back up from Collision.contacts[0].point along its hit normal.
        2. Raycast down the negated hit normal for float.MaxValue, against Collision.collider.
        3. If that fails, repeat steps 1 and 2 with the non-negated normal.
        4. If that fails, try steps 1 to 3 with Collision.contacts[1].
        5. Repeat step 4 until successful or until all contact points are exhausted.
        6. Give up, return Vector3.zero.

    This seems to catch everything, but all those raycasts make me queasy, and I'm not sure how to test that it works in enough cases. Is there a better way?
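
    For what it's worth, the fallback approach described above might be sketched in Unity C# roughly as follows (the helper name and the back-up distance are made up for illustration; this just spells out the questioner's steps, not a recommended solution):

        using UnityEngine;

        // Hypothetical helper: approximate a surface normal for a collision by
        // raycasting back at the collider we hit, from each contact point in turn.
        public static class CollisionNormals
        {
            public static Vector3 GetSurfaceNormal(Collision collision)
            {
                const float backUp = 0.1f; // arbitrary offset away from the surface

                foreach (ContactPoint contact in collision.contacts)
                {
                    // Try casting along the negated hit normal first, then the non-negated one.
                    foreach (Vector3 dir in new[] { -contact.normal, contact.normal })
                    {
                        // Back up from the contact point opposite to the cast direction,
                        // then cast back toward the collider that was hit.
                        Ray ray = new Ray(contact.point - dir * backUp, dir);
                        RaycastHit hit;
                        if (collision.collider.Raycast(ray, out hit, float.MaxValue))
                        {
                            return hit.normal; // RaycastHit.normal is a true surface normal
                        }
                    }
                }

                // All contact points exhausted: give up.
                return Vector3.zero;
            }
        }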

    Read the article

  • Does Unity's "Transparent Bumped Specular" translate to "semi-shiny must be semi-transparent"?

    - by Shivan Dragon
    Unity's documentation for the "Transparent Bumped Specular" shader/material type is simply a concatenation of the descriptions for its Transparent and Specular shader families (and also Bumped, but that doesn't apply to the question): Transparent Properties: This shader can make mesh geometry partially or fully transparent by reading the alpha channel of the main texture. In the alpha, 0 (black) is completely transparent while 255 (white) is completely opaque. If your main texture does not have an alpha channel, the object will appear completely opaque. (...) Specular Properties: (...) Additionally, the alpha channel of the main texture acts as a Specular Map (sometimes called "gloss map"), defining which areas of the object are more reflective than others. Black areas of the alpha will be zero specular reflection, while white areas will be full specular reflection. To me this translates to the following scenario: I have a mesh representing a car tire; the texture needs to be very shiny on the rim parts and almost not shiny at all on the rubber parts; and since the rim is really complex (with cut-out decorations and such), I will not build that into the mesh but fake it with transparency in the texture. I can't do all this using Unity's "Transparent Bumped Specular" shader, because the rubber part of the texture will become semi-transparent when I paint its alpha channel dark grey (because I also want it to be less shiny). Is this correct? If not, how can I make this work?

    Read the article

  • Why is Desktop Unity using the global application menu?

    - by Kazade
    It was announced in another question that the desktop version of Unity will keep the global menu by default. Here are the facts:

        - The global menu was introduced into UNE to save vertical screen space, because at netbook resolutions vertical space is limited.
        - On a modern desktop with a high resolution there is ample vertical space, making this unnecessary.
        - On the announcement of UNE global menus, Mark Shuttleworth himself said the following: "There are outstanding questions about the usability of a panel-hosted menu on much larger screens, where the window and the menu could be very far apart."

    The benefits of a global menu don't seem to carry across to a high-resolution desktop; instead it seems to bring drawbacks (increased mouse travel, a large distance between the menu and its associated window). The other worrying factor is that applications seem to be moving away from having a menu bar, and instead of innovating on this and defining new guidelines for moving beyond the menu, we are giving it prime place right at the top of the desktop. If applications continue moving away from menus, we will have an inconsistent experience concerning where to find application-related options/tools depending on which app you are using (e.g. Chrome). Finally, the current global menu bar implementation doesn't work for all apps, and doesn't even work for all apps in the default install, which means the default desktop experience will be inconsistent. So there are a bunch of reasons why moving to a global menu is a bad idea, and we need some pretty convincing arguments for why it is a good one. What are the reasons for the global menu implementation in the desktop version of Unity?

    Read the article

  • How can I use the dualforward parameter in my unity shader to use lightmaps and normal maps together?

    - by Raphaeltm
    I'm using the free version of Unity and I would like to combine lightmaps with specularity and normal maps. After doing a -bunch- of research, I've figured out that there doesn't seem to be any easy way to do this in the free version of Unity, which doesn't support deferred rendering / easy use of dual lightmaps. However, it looks like it's possible by writing a custom shader: using the "dualforward" parameter in the shader, switching the lightmapping mode to "dual lightmaps", and turning on "Use in forward ren." (basically, writing a shader that specifies the use of dual lightmaps, which should allow for a combination of lightmaps and normal maps). So I downloaded the source code for the default shaders (because all I need is a normal specular bumped shader) and added "dualforward" to the parameters:

        Shader "Bumped Specular Dual Lightmaps" {
        Properties {
            _Color ("Main Color", Color) = (1,1,1,1)
            _SpecColor ("Specular Color", Color) = (0.5, 0.5, 0.5, 1)
            _Shininess ("Shininess", Range (0.03, 1)) = 0.078125
            _MainTex ("Base (RGB) Gloss (A)", 2D) = "white" {}
            _BumpMap ("Normalmap", 2D) = "bump" {}
        }
        SubShader {
            Tags { "RenderType"="Opaque" }
            LOD 400

            CGPROGRAM
            #pragma surface surf BlinnPhong dualforward

            sampler2D _MainTex;
            sampler2D _BumpMap;
            fixed4 _Color;
            half _Shininess;

            struct Input {
                float2 uv_MainTex;
                float2 uv_BumpMap;
            };

            void surf (Input IN, inout SurfaceOutput o) {
                fixed4 tex = tex2D(_MainTex, IN.uv_MainTex);
                o.Albedo = tex.rgb * _Color.rgb;
                o.Gloss = tex.a;
                o.Alpha = tex.a * _Color.a;
                o.Specular = _Shininess;
                o.Normal = UnpackNormal(tex2D(_BumpMap, IN.uv_BumpMap));
            }
            ENDCG
        }
        FallBack "Specular"
        }

    This, however, doesn't seem to work. When I keep the "dualforward" param, every object that uses the shader seems to be lit only by the one directional light in the scene. When I remove the "dualforward" param, the objects look like normal lightmapped objects with no normal maps or specularity. I noticed that support for "dualforward" seems to be new in v3.4.2, so I made sure to download it (I was running 3.4.1), but it still doesn't work. Does anybody have any advice for me?

    Read the article

  • Unity: changing-gravity game & stopping the character when he hits a wall

    - by Sylario
    I am currently working on a 2D puzzle game in the Unity engine. One aspect of this game is the possibility to rotate the level by 90°, which also rotates gravity. The main character is not directly controlled by the player; instead, he falls when the level is rotated. When the main character hits a wall, he should stop moving. If I do not stop him, he sort of blinks and shakes against the wall. To stop him I detect the collision and, depending on the current rotation state, stop the player on "vertical" or "horizontal" tags when an OnCollisionEnter occurs. I must do that because when the player falls onto his relative ground, he must not stop as if he had touched a wall. My problem is the 'side' of platforms, or the 'top' of walls: they use the same tag and thus do not give the correct tag to my character. I tried to put a very small invisible box on the top/side of those elements, but the collision occurs nevertheless. It seems that when the player falls and hits something, he goes through it a bit before being put back at the correct position by Unity. Is there a way: 1) to not stop my character but make him appear immobile on screen, or 2) to detect an "I cannot move anymore" situation other than by using collisions?
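
    For reference, the tag-based stop described above might be sketched in Unity C# roughly like this (the tag names come from the question; the gravity-state enum and the way the rigidbody is frozen are illustrative assumptions, not the asker's actual code):

        using UnityEngine;

        // Rough sketch of the described approach: on collision, compare the tag of what
        // was hit against the current gravity direction and stop the character only when
        // it is a "wall" relative to that direction.
        public class FallingCharacter : MonoBehaviour
        {
            public enum GravityState { Down, Up, Left, Right }
            public GravityState gravity = GravityState.Down;

            void OnCollisionEnter(Collision collision)
            {
                // When gravity is vertical, surfaces tagged "vertical" act as walls;
                // when gravity is horizontal, surfaces tagged "horizontal" do.
                bool gravityIsVertical = gravity == GravityState.Down || gravity == GravityState.Up;
                string wallTag = gravityIsVertical ? "vertical" : "horizontal";

                if (collision.gameObject.CompareTag(wallTag))
                {
                    // Stop the character: zero the velocity and freeze the rigidbody
                    // so it no longer jitters against the wall.
                    var body = GetComponent<Rigidbody>();
                    body.velocity = Vector3.zero;
                    body.isKinematic = true;
                }
            }
        }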

    Read the article

  • For photography use, is Unity overheating my laptop? Should I try openSUSE instead?

    - by SoT
    I am a complete noob here in the Linux world; I was previously using Windows 7. Mine is an HP laptop - Intel Core 2 Duo T5470 @ 1.60GHz × 2 / 965GM with 2GB RAM. I installed Ubuntu 12.04 LTS and quite like its display; I could tell it was 3D even before I knew it was the Unity 3D interface. My uses are image editing, home use, downloads, browsing, etc. - no video editing or gaming at all. Being a photography enthusiast, I use image editing programs fairly heavily. But I now feel my laptop is getting a bit overheated - processor and hard disk. I tried lm-sensors and could not make much out of it. I installed Xsensors.7; it gives the same output as lm-sensors did: temperatures for four things - temp1, temp2, temp3, and temp4 - for "acpitz". Please guide me on this. I also wanted to ask something more: which one is better for working with images - photography, I mean - openSUSE 12.1 or Ubuntu with Unity 3D? Can I get the same display quality with the openSUSE distribution? I've heard that openSUSE uses power more efficiently on laptops; is there any truth to that? Please suggest whether I should try openSUSE or not, and if so, with which GUI - KDE or GNOME? Thanks in advance. Regards, SoT

    Read the article

  • Cannot resolve Dictionary in Unity container

    - by IanR
    Hi, I've just stumbled upon this: within a Unity container, I want to register IDictionary<TK, TV>; assume it's IDictionary<string, int>:

        _unityContainer = new UnityContainer()
            .RegisterType<IDictionary<string, int>, Dictionary<string, int>>();

    But if I try

        var d = _unityContainer.Resolve<IDictionary<string, int>>();

    it fails to resolve, and I get:

        Microsoft.Practices.Unity.ResolutionFailedException: Resolution of the dependency failed,
        type = "System.Collections.Generic.IDictionary`2[System.String,System.Int32]", name = "(none)".
        Exception occurred while: while resolving.
        Exception is: InvalidOperationException - The type Dictionary`2 has multiple constructors of length 2. Unable to disambiguate.
        At the time of the exception, the container was:
          Resolving System.Collections.Generic.Dictionary`2[System.String,System.Int32],(none)
          (mapped from System.Collections.Generic.IDictionary`2[System.String,System.Int32], (none))
        --- System.InvalidOperationException: The type Dictionary`2 has multiple constructors of length 2. Unable to disambiguate.

    So it looks like Unity has found the type to resolve (Dictionary<string, int>) but failed to new it up. How come Unity can't resolve this type? If I write IDictionary<string, int> d = new Dictionary<string, int>() myself, that works. Any ideas? Thanks!
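
    As context for the error, "multiple constructors of length 2" means Unity is trying to pick a constructor for Dictionary<string, int> and cannot choose between its two-parameter overloads. A commonly suggested workaround, shown here only as an untested sketch, is to pin the registration to a specific constructor with an InjectionConstructor, e.g. the parameterless one:

        using System.Collections.Generic;
        using Microsoft.Practices.Unity;

        // Sketch: register the mapping and tell Unity exactly which constructor to use,
        // so it no longer has to disambiguate between Dictionary's two-argument overloads.
        var container = new UnityContainer()
            .RegisterType<IDictionary<string, int>, Dictionary<string, int>>(
                new InjectionConstructor());

        IDictionary<string, int> d = container.Resolve<IDictionary<string, int>>();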

    Read the article

  • Combining MVVM Light Toolkit and Unity 2.0

    - by Alan Cordner
    This is more of a commentary than a question, though feedback would be nice. I have been tasked with creating the user interface for a new project we are doing. We want to use WPF, and I wanted to learn all of the modern UI design techniques available. Since I am fairly new to WPF, I have been researching what is available. I think I have pretty much settled on using the MVVM Light Toolkit (mainly because of its "Blendability" and the EventToCommand behavior!), but I wanted to incorporate IoC as well. So, here is what I have come up with: I have modified the default ViewModelLocator class in an MVVM Light project to use a UnityContainer to handle dependency injection. Considering I didn't know what 90% of these terms meant 3 months ago, I think I'm on the right track.

        // Example of MVVM Light Toolkit ViewModelLocator class that uses the Microsoft
        // Unity 2.0 Inversion of Control container to resolve ViewModel dependencies.
        using Microsoft.Practices.Unity;

        namespace MVVMLightUnityExample
        {
            public class ViewModelLocator
            {
                public static UnityContainer Container { get; set; }

                #region Constructors

                static ViewModelLocator()
                {
                    if (Container == null)
                    {
                        Container = new UnityContainer();

                        // register all dependencies required by view models
                        Container
                            .RegisterType<IDialogService, ModalDialogService>(new ContainerControlledLifetimeManager())
                            .RegisterType<ILoggerService, LogFileService>(new ContainerControlledLifetimeManager())
                            ;
                    }
                }

                /// <summary>
                /// Initializes a new instance of the ViewModelLocator class.
                /// </summary>
                public ViewModelLocator()
                {
                    ////if (ViewModelBase.IsInDesignModeStatic)
                    ////{
                    ////    // Create design time view models
                    ////}
                    ////else
                    ////{
                    ////    // Create run time view models
                    ////}

                    CreateMain();
                }

                #endregion

                #region MainViewModel

                private static MainViewModel _main;

                /// <summary>
                /// Gets the Main property.
                /// </summary>
                public static MainViewModel MainStatic
                {
                    get
                    {
                        if (_main == null)
                        {
                            CreateMain();
                        }
                        return _main;
                    }
                }

                /// <summary>
                /// Gets the Main property.
                /// </summary>
                [System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Performance", "CA1822:MarkMembersAsStatic",
                    Justification = "This non-static member is needed for data binding purposes.")]
                public MainViewModel Main
                {
                    get { return MainStatic; }
                }

                /// <summary>
                /// Provides a deterministic way to delete the Main property.
                /// </summary>
                public static void ClearMain()
                {
                    _main.Cleanup();
                    _main = null;
                }

                /// <summary>
                /// Provides a deterministic way to create the Main property.
                /// </summary>
                public static void CreateMain()
                {
                    if (_main == null)
                    {
                        // allow Unity to resolve the view model and hold onto the reference
                        _main = Container.Resolve<MainViewModel>();
                    }
                }

                #endregion

                #region OrderViewModel

                // property to hold the order number (injected into the OrderViewModel constructor when resolved)
                public static string OrderToView { get; set; }

                /// <summary>
                /// Gets the OrderViewModel property.
                /// </summary>
                public static OrderViewModel OrderViewModelStatic
                {
                    get
                    {
                        // allow Unity to resolve the view model
                        // do not keep a local reference to the resolved instance because we need a new instance
                        // each time - the corresponding View is a UserControl that can be used multiple times
                        // within a single window/view
                        // pass the current value of the OrderToView parameter to the constructor!
                        return Container.Resolve<OrderViewModel>(new ParameterOverride("orderNumber", OrderToView));
                    }
                }

                /// <summary>
                /// Gets the OrderViewModel property.
                /// </summary>
                [System.Diagnostics.CodeAnalysis.SuppressMessage("Microsoft.Performance", "CA1822:MarkMembersAsStatic",
                    Justification = "This non-static member is needed for data binding purposes.")]
                public OrderViewModel Order
                {
                    get { return OrderViewModelStatic; }
                }

                #endregion

                /// <summary>
                /// Cleans up all the resources.
                /// </summary>
                public static void Cleanup()
                {
                    ClearMain();
                    Container = null;
                }
            }
        }

    And the MainViewModel class, showing dependency injection in use:

        using GalaSoft.MvvmLight;
        using Microsoft.Practices.Unity;

        namespace MVVMLightUnityExample
        {
            public class MainViewModel : ViewModelBase
            {
                private IDialogService _dialogs;
                private ILoggerService _logger;

                /// <summary>
                /// Initializes a new instance of the MainViewModel class. This default constructor calls the
                /// non-default constructor, resolving the interfaces used by this view model.
                /// </summary>
                public MainViewModel()
                    : this(ViewModelLocator.Container.Resolve<IDialogService>(),
                           ViewModelLocator.Container.Resolve<ILoggerService>())
                {
                    if (IsInDesignMode)
                    {
                        // Code runs in Blend --> create design time data.
                    }
                    else
                    {
                        // Code runs "for real"
                    }
                }

                /// <summary>
                /// Initializes a new instance of the MainViewModel class.
                /// Interfaces are automatically resolved by the IoC container.
                /// </summary>
                /// <param name="dialogs">Interface to dialog service</param>
                /// <param name="logger">Interface to logger service</param>
                public MainViewModel(IDialogService dialogs, ILoggerService logger)
                {
                    _dialogs = dialogs;
                    _logger = logger;

                    if (IsInDesignMode)
                    {
                        // Code runs in Blend --> create design time data.
                        _dialogs.ShowMessage("Running in design-time mode!", "Injection Constructor",
                            DialogButton.OK, DialogImage.Information);
                        _logger.WriteLine("Running in design-time mode!");
                    }
                    else
                    {
                        // Code runs "for real"
                        _dialogs.ShowMessage("Running in run-time mode!", "Injection Constructor",
                            DialogButton.OK, DialogImage.Information);
                        _logger.WriteLine("Running in run-time mode!");
                    }
                }

                public override void Cleanup()
                {
                    // Clean up if needed
                    _dialogs = null;
                    _logger = null;

                    base.Cleanup();
                }
            }
        }

    And the OrderViewModel class:

        using GalaSoft.MvvmLight;
        using Microsoft.Practices.Unity;

        namespace MVVMLightUnityExample
        {
            /// <summary>
            /// This class contains properties that a View can data bind to.
            /// <para>
            /// Use the <strong>mvvminpc</strong> snippet to add bindable properties to this ViewModel.
            /// </para>
            /// <para>
            /// You can also use Blend to data bind with the tool's support.
            /// </para>
            /// <para>
            /// See http://www.galasoft.ch/mvvm/getstarted
            /// </para>
            /// </summary>
            public class OrderViewModel : ViewModelBase
            {
                private const string testOrderNumber = "123456";

                private Order _order;

                /// <summary>
                /// Initializes a new instance of the OrderViewModel class.
                /// </summary>
                public OrderViewModel()
                    : this(testOrderNumber)
                {
                }

                /// <summary>
                /// Initializes a new instance of the OrderViewModel class.
                /// </summary>
                public OrderViewModel(string orderNumber)
                {
                    if (IsInDesignMode)
                    {
                        // Code runs in Blend --> create design time data.
                        _order = new Order(orderNumber, "My Company", "Our Address");
                    }
                    else
                    {
                        _order = GetOrder(orderNumber);
                    }
                }

                public override void Cleanup()
                {
                    // Clean own resources if needed
                    _order = null;

                    base.Cleanup();
                }
            }
        }

    And the code that could be used to display an order view for a specific order:

        public void ShowOrder(string orderNumber)
        {
            // pass the order number to show to the ViewModelLocator, to be injected
            // into the constructor of the OrderViewModel instance
            ViewModelLocator.OrderToShow = orderNumber;

            View.OrderView orderView = new View.OrderView();
        }

    These examples have been stripped down to show only the IoC ideas. It took a lot of trial and error, searching the internet for examples, and discovering that the Unity 2.0 documentation is lacking (at best), to come up with this solution. Let me know if you think it could be improved.

    Read the article

  • Multi tenancy with Unity

    - by Savvas Sopiadis
    Hi everybody! I'm trying to implement this scenario using Unity and I can't figure out how it could be done: the same web application (ASP.NET MVC) should be made accessible to more than one client (multi-tenant). The URL of the web site differentiates the client (I know how to get this). So, given the URL, one could set the (let's call it) IConnectionStringProvider parameter (which will afterwards be injected into IRepository and so on). Through which mechanism (using Unity) do I set the IConnectionStringProvider parameter at run time? I have done this in the past using Windsor & IHandlerSelector (see this), but it's my first attempt using Unity. Any help is deeply appreciated! Thanks in advance.
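
    One rough sketch of how this might look with Unity 2.0 (the IConnectionStringProvider name comes from the question; its member, the ConnectionStringProvider class, and the convention of naming connection strings after the request host are all illustrative assumptions, not a tested recipe) is to register the provider with an InjectionFactory that inspects the current request at resolve time:

        using System.Configuration;
        using System.Web;
        using Microsoft.Practices.Unity;

        // The interface name comes from the question; its single member here is assumed.
        public interface IConnectionStringProvider
        {
            string ConnectionString { get; }
        }

        // Hypothetical implementation that simply wraps a resolved connection string.
        public class ConnectionStringProvider : IConnectionStringProvider
        {
            public ConnectionStringProvider(string connectionString)
            {
                ConnectionString = connectionString;
            }

            public string ConnectionString { get; private set; }
        }

        public static class ContainerBootstrapper
        {
            public static void Configure(IUnityContainer container)
            {
                // Resolve the tenant from the request URL each time the provider is built,
                // and look up a connection string named after the host (assumed convention).
                container.RegisterType<IConnectionStringProvider>(
                    new InjectionFactory(c =>
                    {
                        string tenantHost = HttpContext.Current.Request.Url.Host;
                        string connectionString =
                            ConfigurationManager.ConnectionStrings[tenantHost].ConnectionString;
                        return new ConnectionStringProvider(connectionString);
                    }));
            }
        }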

    Read the article
