Search Results

Search found 29820 results on 1193 pages for 'default implementation'.


  • Can't disable giant cursors (from accessibility mode)

    - by jackweirdy
    I've just installed Ubuntu 12.04 from a live CD. Out of curiosity, I enabled the accessibility options for people who are hard of sight. As you can guess, this does the usual stuff of inverting colours, increasing text size and making the cursor larger. Having finished the installation, I booted into the new system to find accessibility mode was still enabled. From the LightDM login screen I disabled this, which switched colours and text size back to default; however, only the pointer cursor has gone back to default. To put it another way, the "hand" icon that you get when hovering over a link, the cursor which appears when typing and pretty much every other cursor on the system are still large. I've looked in the Universal Access menu, but there's no option to disable large cursors. I've tried toggling accessibility on and off, but to no avail.
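
    A possible workaround, assuming the 12.04 session is GNOME 3.x with its usual GSettings schema (the key name and the default value of 24 are assumptions, not confirmed in the question), is to reset the cursor size by hand and log back in:

        # See what the accessibility profile left behind (48 is the "large" size)
        gsettings get org.gnome.desktop.interface cursor-size

        # Put it back to the usual default, then log out and back in
        gsettings set org.gnome.desktop.interface cursor-size 24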


  • Using Network Load Balancing to distribute load for SharePoint 2010 – Part 3 of building my own development SharePoint 2010 Farm

    - by ybbest
    Part 1 | Part 2 | Part 3 of building my own development SharePoint 2010 Farm. In my last post, I installed SharePoint 2010 on one of the servers (WFE One) and configured it using the OOB SharePoint configuration wizard. In this post I will show you how to use OOB Windows Network Load Balancing to distribute load for a SharePoint 2010 site.
    1. Install SharePoint on another server, WFE Two (you can follow the steps in my last post), but instead of choosing "create a new farm", select "connect to existing farm" this time.
    2. Click Next, then click the Retrieve Database Names button and select the farm configuration database.
    3. Click Next and enter the passphrase you specified when you first installed the SharePoint farm.
    4. Click Advanced Settings and select "Use this machine to host the web site".
    5. Click OK to finish the configuration.
    6. Next, install NLB on the two WFE (web front end) SharePoint servers.
    7. Configure NLB to create the cluster: go to Start → Administrative Tools → Network Load Balancing Manager.
    8. Right-click the Network Load Balancing Clusters node and select New Cluster.
    9. Type in the host name that is to be part of the new cluster.
    10. Type in the IP address for the cluster.
    11. Select Multicast for this cluster (the default is Unicast).
    12. You can configure the Port Rules for the cluster, but I will leave the defaults here.
    13. Add another WFE to the cluster.
    14. Type in the host name that is to be part of the new cluster.
    15. Set the Priority to 2.
    16. Click Next to complete the cluster setup.
    17. Create an entry in DNS for the new cluster.
    18. Add the binding to the IIS site in IIS Manager.
    19. Change the alternate access mapping for your default site collection from http://sp2010wefone to http://team.
    20. Browse to http://team; you will be redirected to the SharePoint site.


  • How to prevent Gnome desktop from crashing?

    - by nixnotwin
    I have Ubuntu 10.10 32-bit running on my Asus EEEPC 1005PX. I am experiencing frequent desktop crashes. When I turn on my netbook, at least once in five times the default Ubuntu theme disappears and the classic GNOME theme appears. Many times while doing some work, the desktop crashes, the CLI gets shown, and after a few seconds the login screen appears. I am not using any widgets or dock bars; I just have a single GNOME panel with the default menus. The crashes also happen when using the default bundled Ubuntu apps. Is there any way to avoid these crashes?
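
    Before guessing at a fix, it usually helps to find out what is actually crashing. A minimal diagnostic sketch using the standard log locations on 10.10 (nothing here is specific to this machine):

        # Session-level errors from GNOME and the panel
        tail -n 50 ~/.xsession-errors

        # X server problems (driver, GPU)
        grep -iE 'error|\(EE\)' /var/log/Xorg.0.log

        # Kernel-level trouble such as OOM kills or GPU resets
        dmesg | tail -n 30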


  • OpenGL behaviour depending on the graphics card?

    - by Dan
    This is something that never happened to me before. I have OpenGL code that uses GLSL shaders to texture a 3D model. The code involves a lot of GPU texture processing, blending, etc. I wanted to check how the performance of my code improves using a faster graphics card (both the new and the old card are NVIDIA, always using the NVIDIA development drivers). But now I have found that once I run the code using the new graphics card, it behaves completely differently (the final render looks wrong), probably because some blending effect is not performed correctly. I haven't really looked into what has changed, but I am guessing that some OpenGL states are, by default, set differently. Is this possible? Have you ever found different OpenGL/GLSL behaviour using different graphics cards? Any "fast" solution? (So far I've thought of plugging the old one back in, pushing all OpenGL default states, and comparing with the ones I initially get using the new card.)


  • Writing to a structured buffer with a compute shader (D3D11)

    - by Vertexwahn
    I have some problems writing to a structured buffer. First I create a structured buffer that is filled with float values from 0 to 99. Afterwards, a copy of the structured buffer is made to a CPU-accessible buffer in order to print the contents of the structured buffer to the console. The output is as expected (the numbers 0 to 99 appear on the console). Afterwards I use a compute shader that should change the contents of the structured buffer:

        RWStructuredBuffer<float> Result : register( u0 );

        [numthreads(1, 1, 1)]
        void CS_main( uint3 GroupId : SV_GroupID )
        {
            Result[GroupId.x] = GroupId.x * 10;
        }

    But the compute shader does not change the contents of the structured buffer. The source code can be found here (main.cpp): https://bitbucket.org/Vertexwahn/cmakedemos/src/4abb067afd5781b87a553c4c720956668adca22a/D3D11ComputeShader/src/main.cpp?at=default FillCS.hlsl: https://bitbucket.org/Vertexwahn/cmakedemos/src/4abb067afd5781b87a553c4c720956668adca22a/D3D11ComputeShader/src/FillCS.hlsl?at=default


  • How to migrate from Banshee to Rhythmbox?

    - by rafalcieslak
    As has been decided, Ubuntu Precise 12.04 will feature Rhythmbox as the default music player. I am aware that this does not mean I will not be able to use Banshee; nevertheless, I would like to switch. I had been a Rhythmbox fan for a long time, but after the switch to Banshee in Natty I decided to give it a try and migrated to it completely. However, I am not very happy with it: it lags a lot for me and has some other issues. I would like to export all Banshee data to Rhythmbox. That includes:
    - Music library
    - Playlists
    - Preferably playcounts and ratings
    - Radio stations
    - Cover pictures
    What should I do to move all this data to Rhythmbox, get it set as the default music player, and smoothly switch over completely?
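
    There is no single built-in migration tool, but Banshee keeps its library in an SQLite database, so the ratings and play counts can at least be extracted. A sketch, assuming the usual ~/.config/banshee-1/banshee.db location and the CoreTracks table layout (both are assumptions about Banshee's internals, so verify with .schema first):

        # Inspect the schema before trusting any column names
        sqlite3 ~/.config/banshee-1/banshee.db '.schema CoreTracks'

        # Dump URI, rating and play count for every track
        sqlite3 -csv ~/.config/banshee-1/banshee.db \
          'SELECT Uri, Rating, PlayCount FROM CoreTracks;' > banshee-stats.csv

    The resulting CSV can then be matched against Rhythmbox's own database by URI, either by hand or with a small script.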


  • How to create Windows XP LiveUSB using Ubuntu to replace it

    - by Orion Clark
    I am using an Acer Aspire One netbook with no CD drive, and would like to uninstall Ubuntu 12.04 LTS and install Windows XP in its place. The problem is that I can't seem to find a program that can put the Windows boot files on a USB drive from an ISO file. I have Ubuntu fully installed and have tried using UNetbootin. When I tried booting from the UNetbootin drive I got a screen with a blue box that had the word "default" highlighted in it. Underneath the box there was a countdown that said "will boot from default in 10"; after the countdown finished, the number would reset to ten and nothing would happen. Can someone suggest another program that would be useful for this, please?


  • Where should the database and mail parameters be stored in a Symfony2 app?

    - by Songo
    In the default folder structure of a Symfony2 project, the database and mail server credentials are stored in ProjectRoot/app/config/parameters.yml with these default values:

        parameters:
            database_driver:   pdo_mysql
            database_host:     127.0.0.1
            database_port:     null
            database_name:     symfony
            database_user:     root
            database_password: null
            mailer_transport:  smtp
            mailer_host:       127.0.0.1
            mailer_user:       null
            mailer_password:   null
            locale:            en
            secret:            ThisTokenIsNotSoSecretChangeIt

    During development we change these parameters to point at the development database and mail servers. This file is checked into the source code repository. The problem is when we want to deploy to the production server. We are thinking about automating the deployment process by checking the project out from git and deploying it to the production server. The issue is that our project manager has to update these parameters manually after each update. The production database and mail server parameters are confidential and only our project manager knows them. I need a way to automate this step, and suggestions on where to store the production parameters until they are applied.
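
    One common convention (a sketch of a workflow, not Symfony-specific tooling) is to stop versioning parameters.yml altogether and commit only a template, so each environment keeps its own untracked copy:

        # Keep a committed template with placeholder values
        cp app/config/parameters.yml app/config/parameters.yml.dist

        # Stop tracking the real file and ignore it from now on
        git rm --cached app/config/parameters.yml
        echo 'app/config/parameters.yml' >> .gitignore
        git commit -m 'Track parameters.yml.dist instead of real credentials'

    The project manager then fills in the production copy once on the server, and subsequent checkouts of the project no longer touch it.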


  • Inside BackgroundWorker

    - by João Angelo
    The BackgroundWorker is a reusable component that can be used in different contexts, but sometimes with unexpected results. If you are like me, you have mostly used background workers while doing Windows Forms development due to the flexibility they offer for running a background task. They support cancellation and give events that signal progress updates and task completion. When used in Windows Forms, these events (ProgressChanged and RunWorkerCompleted) get executed back on the UI thread where you can freely access your form controls. However, the logic of the progress changed and worker completed events being invoked in the thread that started the background worker is not something you get directly from the BackgroundWorker, but instead from the fact that you are running in the context of Windows Forms. Take the following example that illustrates the use of a worker in three different scenarios: Console Application or Windows Service; Windows Forms; WPF.

        using System;
        using System.ComponentModel;
        using System.Threading;
        using System.Windows.Forms;
        using System.Windows.Threading;

        class Program
        {
            static AutoResetEvent Synch = new AutoResetEvent(false);

            static void Main()
            {
                var bw1 = new BackgroundWorker();
                var bw2 = new BackgroundWorker();
                var bw3 = new BackgroundWorker();

                Console.WriteLine("DEFAULT");

                var unspecializedThread = new Thread(() =>
                {
                    OutputCaller(1);

                    SynchronizationContext.SetSynchronizationContext(
                        new SynchronizationContext());

                    bw1.DoWork += (sender, e) => OutputWork(1);
                    bw1.RunWorkerCompleted += (sender, e) => OutputCompleted(1);

                    // Uses default SynchronizationContext
                    bw1.RunWorkerAsync();
                });

                unspecializedThread.IsBackground = true;
                unspecializedThread.Start();

                Synch.WaitOne();
                Console.WriteLine();

                Console.WriteLine("WINDOWS FORMS");

                var windowsFormsThread = new Thread(() =>
                {
                    OutputCaller(2);

                    SynchronizationContext.SetSynchronizationContext(
                        new WindowsFormsSynchronizationContext());

                    bw2.DoWork += (sender, e) => OutputWork(2);
                    bw2.RunWorkerCompleted += (sender, e) => OutputCompleted(2);

                    // Uses WindowsFormsSynchronizationContext
                    bw2.RunWorkerAsync();

                    Application.Run();
                });

                windowsFormsThread.IsBackground = true;
                windowsFormsThread.SetApartmentState(ApartmentState.STA);
                windowsFormsThread.Start();

                Synch.WaitOne();
                Console.WriteLine();

                Console.WriteLine("WPF");

                var wpfThread = new Thread(() =>
                {
                    OutputCaller(3);

                    SynchronizationContext.SetSynchronizationContext(
                        new DispatcherSynchronizationContext());

                    bw3.DoWork += (sender, e) => OutputWork(3);
                    bw3.RunWorkerCompleted += (sender, e) => OutputCompleted(3);

                    // Uses DispatcherSynchronizationContext
                    bw3.RunWorkerAsync();

                    Dispatcher.Run();
                });

                wpfThread.IsBackground = true;
                wpfThread.SetApartmentState(ApartmentState.STA);
                wpfThread.Start();

                Synch.WaitOne();
            }

            static void OutputCaller(int workerId)
            {
                Console.WriteLine(
                    "bw{0}.{1} | Thread: {2} | IsThreadPool: {3}",
                    workerId,
                    "RunWorkerAsync".PadRight(18),
                    Thread.CurrentThread.ManagedThreadId,
                    Thread.CurrentThread.IsThreadPoolThread);
            }

            static void OutputWork(int workerId)
            {
                Console.WriteLine(
                    "bw{0}.{1} | Thread: {2} | IsThreadPool: {3}",
                    workerId,
                    "DoWork".PadRight(18),
                    Thread.CurrentThread.ManagedThreadId,
                    Thread.CurrentThread.IsThreadPoolThread);
            }

            static void OutputCompleted(int workerId)
            {
                Console.WriteLine(
                    "bw{0}.{1} | Thread: {2} | IsThreadPool: {3}",
                    workerId,
                    "RunWorkerCompleted".PadRight(18),
                    Thread.CurrentThread.ManagedThreadId,
                    Thread.CurrentThread.IsThreadPoolThread);

                Synch.Set();
            }
        }

    Output:

        // DEFAULT
        // bw1.RunWorkerAsync     | Thread: 3 | IsThreadPool: False
        // bw1.DoWork             | Thread: 4 | IsThreadPool: True
        // bw1.RunWorkerCompleted | Thread: 5 | IsThreadPool: True

        // WINDOWS FORMS
        // bw2.RunWorkerAsync     | Thread: 6 | IsThreadPool: False
        // bw2.DoWork             | Thread: 5 | IsThreadPool: True
        // bw2.RunWorkerCompleted | Thread: 6 | IsThreadPool: False

        // WPF
        // bw3.RunWorkerAsync     | Thread: 7 | IsThreadPool: False
        // bw3.DoWork             | Thread: 5 | IsThreadPool: True
        // bw3.RunWorkerCompleted | Thread: 7 | IsThreadPool: False

    As you can see, the output between the first and remaining scenarios is somewhat different. While in Windows Forms and WPF the worker completed event runs on the thread that called RunWorkerAsync, in the first scenario the same event runs on any thread available in the thread pool. Another scenario where you can get the first behavior, even when on Windows Forms or WPF, is if you chain the creation of background workers, that is, you create a second worker in the DoWork event handler of an already running worker. Since the DoWork executes in a thread from the pool, the second worker will use the default synchronization context and the completed event will not run in the UI thread.


  • Upgrade from 10.10 to 11.04

    - by hemanta pathak
    Upgrading from 10.10 to 11.04 using Update Manager normally works fine. But after installing a piece of software that bundles its own runtime environment (loader and system files) and installs that custom runtime (basically the loader) in the location where the native one resides, the upgrade fails: it starts and, after some time, encounters an error and aborts. Basically, /usr/bin/dpkg throws an error because it is unable to locate a system shared library in the aforementioned third-party runtime folder (/usr/bin/dpkg should not search the third-party runtime folder; instead it should look in the system default folder). But if we remove the installed third-party loader from the default system location /lib and place it somewhere else, the upgrade problem goes away. This makes me believe /usr/bin/dpkg invokes (loads) the wrong loader and as such goes looking for the dependent libraries in the third-party folder. Can someone take a look at this? Is there some bug in dpkg?
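
    To confirm (or rule out) the theory that dpkg is picking up the third-party loader, it can help to look at which ELF interpreter and libraries dpkg actually resolves. A diagnostic sketch using standard tools:

        # Which dynamic loader the dpkg binary requests
        readelf -l /usr/bin/dpkg | grep interpreter

        # Which shared libraries (and from which directories) are resolved at run time
        ldd /usr/bin/dpkg

        # Directories currently configured for the dynamic linker
        cat /etc/ld.so.conf.d/*.conf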


  • Google Analytics Dashboard: week-by-week view

    - by Silver Dragon
    Setting up a Google Analytics dashboard allows webmasters to get a weekly progress report of marketing achievements and keep a finger on what's going on at their web properties. However, by default the dashboard always displays a day-by-day report, which isn't actionable in markets where meaningful improvements happen on a week-over-week or month-over-month basis. Is there any way the default view (and the reports sent out via email) can be set to display week-level resolution, as opposed to day-level resolution? (i.e., repro: Analytics → site → Standard Reports → Audience → Overview → right side of the window, click "Week".) Many thanks!


  • Thinktecture IdentityServer Azure Edition RC

    - by Your DisplayName here!
    I found some time over the holidays to finalize the Azure edition of IdentityServer. http://identityserver.codeplex.com/releases/view/81206 The biggest difference from the on-premise version (and earlier Azure betas) is that, by default, IdSrv now uses Azure Storage for all data storage (configuration and user data). This means that there is no longer any need for SQL Azure (which is still supported out of the box, just not the default anymore). The download includes a readme file with setup instructions. In a nutshell:
    - Create a new hosted service and upload your certificates
    - Modify the service configuration file in the download to your needs (signing cert, connection strings to storage…)
    - Deploy the package via the portal or other tools
    - Use the new PowerShell scripts to add users
    If you encounter any problems, please give me feedback.


  • Trying to boot from usb

    - by iron
    I have similar issues, but none with the 12.04 build. I'm trying this on a Dell D610 laptop with a bad hard drive and was told I could just boot directly from the USB drive. I'm using an 8 GB USB drive and have tried using the UUI tool, and I get this message:

        SYSLINUX 4.06 EDD 4.06-pre1 Copyright (c) 1994-2011 H. Peter Anvin et al
        ERROR: No configuration file found
        No DEFAULT or UI configuration directive found!

    Then I tried using UNetbootin and got the boot-up screen with only the default option; it would say "auto boot" with a 10-second countdown and then start over again. I do have the boot sequence set to USB first.


  • How to replace GRUB with BURG without overwriting MBR?

    - by StepTNT
    I'd like to install BURG, but I don't want it to overwrite my MBR. That's because I've got two bootloaders on my system: the default Windows 7 one in the MBR, and GRUB in /dev/sda2. With the first power button the machine boots into Windows, and with the second button it boots from /dev/sda2, loading GRUB and then Kubuntu. Now I want to replace GRUB with BURG. I've installed BURG with burg-install /dev/sda2 --force because I don't want my MBR to be overwritten (pressing the first power button MUST load Windows and avoid showing any sign of Linux). Setup completed without errors. If I open burg-manager it loads my settings, allowing me to change them and to test everything with burg-emu, so I've changed my settings, but the second power button still loads GRUB (and even a different version from the default Kubuntu one). How can I replace the GRUB that's on /dev/sda2 with BURG, without overwriting the MBR?
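
    Assuming the BURG packaging mirrors GRUB's tooling (burg-install plus an update-burg script; treat those names as an assumption if your package differs), re-running the partition install and regenerating the menu would look roughly like this:

        # Reinstall BURG into the partition boot sector only, never the MBR
        sudo burg-install --force /dev/sda2

        # Regenerate the BURG menu (typically /boot/burg/burg.cfg) so the menu
        # shown from /dev/sda2 is actually BURG's and not the old GRUB one
        sudo update-burg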


  • Checking whether a specific key was pressed in enchantJS

    - by MxyL
    I am using enchant.js and would like to use the letter and number keys, as well as the numpad, to do different things (e.g. hotkeys). From this page http://users.csc.calpoly.edu/~foaad/enchant/guide/playerInput.html: "By default, enchant.js provides input listeners for six buttons: UP, DOWN, LEFT, RIGHT, A, and B. By default, the directions are bound to the arrow keys. Any of the six buttons may also be bound to any key with an ASCII value. We’ll address that later." So enchant provides the ability to bind keys to different inputs such as up, down, left and right, but how can I simply check whether the D or X key was pressed, and if so, perform certain actions based on that event?


  • Nvidia GT218 repository drivers don't work

    - by user1042840
    I upgraded all packages with the sudo apt-get upgrade command on my Ubuntu 10.04 box, and I now have Ubuntu 12.04 with the 3.2.0-29-generic-pae kernel. I have two monitors and the following GPU:

        01:00.0 VGA compatible controller: NVIDIA Corporation GT218 [NVS 300] (rev a2)

    After upgrading to 12.04, I somehow lost my previous setup with one common workspace stretched across two monitors. When Ubuntu starts, only one monitor is on, and I see this message on the active monitor: "Not optimum mode. Recommended mode: 1680x1050 60Hz". I used the NVIDIA proprietary drivers on 10.04, but now jockey-text --list shows:

        xorg:nvidia_current - NVIDIA accelerated graphics driver (Proprietary, Disabled, Not in use)
        xorg:nvidia_current_updates - NVIDIA accelerated graphics driver (post-release updates) (Proprietary, Enabled, Not in use)

    When I run sudo nvidia-settings it says "You do not appear to be using the NVIDIA X driver. Please edit your X configuration file (just run `nvidia-xconfig` as root), and restart the X server." I ran nvidia-xconfig and rebooted, but jockey-text --list says the same after the reboot: Not in use. The same with nvidia-current: Enabled but Not in use. I also tried nvidia-173, but I ended up in a tty immediately at startup, so I removed it. I used to have some problems with the NVIDIA proprietary drivers on 10.04 (I had to put paths to the EDID files explicitly in /etc/X11/xorg.conf), but the resolution was as recommended and both monitors were working. If I understand correctly, the nouveau drivers are used by default now, because the resolution is still quite high, definitely not 800x600. xrandr showed:

        xrandr: Failed to get size of gamma for output default
        Screen 0: minimum 320 x 400, current 1600 x 1200, maximum 1600 x 1200
        default connected 1600x1200+0+0 0mm x 0mm
           1600x1200       66.0*
           1280x1024       76.0
           1024x768        76.0
           800x600         73.0
           640x480         73.0
           640x400          0.0
           320x400          0.0
           1680x1050_60.00 (0x4f)  146.2MHz
                 h: width  1680 start 1784 end 1960 total 2240 skew    0 clock  65.3KHz
                 v: height 1050 start 1053 end 1059 total 1089           clock  60.0Hz

    However, colors seem a bit faded and blurry with the nouveau drivers, the mouse cursor is invisible when placed inside a Firefox window, and only one monitor is working. I like open source and, if possible, I'd prefer to use the nouveau drivers, but a few things would need to be fixed. I'm also curious why the nvidia-current drivers from the repository don't work now. I read it has something to do with the new X server in Ubuntu 12.04; is that true? How can I get it back to work?
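
    Before reinstalling anything, it is worth confirming which kernel module and X driver are actually in use; "Enabled but not in use" usually means the nvidia module never loaded. A diagnostic sketch with standard tools:

        # Which GPU kernel module is loaded right now
        lsmod | grep -E 'nouveau|nvidia'

        # What the X server tried to load, and any errors it hit
        grep -iE 'nouveau|nvidia|\(EE\)' /var/log/Xorg.0.log

        # Whether nouveau is blacklisted by the proprietary driver packages
        grep -r nouveau /etc/modprobe.d/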


  • Which Compiz plugins can I safely disable?

    - by Sheldno
    On 11.10, which Compiz plugins are enabled by default, i.e. which ones should I leave enabled for some particular reason? I've imported my Compiz settings from my previous install, so I can't tell. I'd like to disable all unnecessary plugins because I've noticed the potential for interference or performance issues. I don't think I can trust the CCSM "Reset to defaults" feature, because when I use it I need to run unity --reset afterwards in order for my desktop to work again (implying that the default values it resets to are not the values that Ubuntu/Unity requires).
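
    If the goal is simply to get back to the stock 11.10 plugin set rather than curate the list by hand, one approach (a sketch; the gconf path is an assumption about where the CompizConfig gconf backend stores its data on 11.10) is to wipe the imported settings and let Unity rebuild them:

        # Drop the imported Compiz settings stored in gconf (path is an assumption)
        gconftool-2 --recursive-unset /apps/compiz-1

        # Let Unity re-register the plugins it needs
        unity --reset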


  • How You Helped Shape Java EE 7...

    - by reza_rahman
    I have been working with the JCP in various roles since EJB 3/Java EE 5 (much of it on my own time), eventually culminating in my decision to accept my current role at Oracle (despite its inevitable set of unique challenges, a role I find by and large positive and fulfilling). During these years, it has always been clear to me that pretty much everyone in the JCP genuinely cares about openness, feedback and developer participation. Perhaps the most visible sign to date of this high regard for grassroots-level input is a survey on Java EE 7 gathered a few months ago. The survey was designed to get open feedback on a number of critical issues central to the Java EE 7 umbrella specification, including which APIs to include in the standard. When we started the survey, I don't think anyone was certain what the level of participation from developers would really be. I also think everyone was pleasantly surprised that a large number of developers (around 1100) took the time out to vote on these very important issues that could impact their own professional life. And it wasn't just a matter of the quantity of responses. I was particularly impressed with the quality of the comments made through the survey (some of which I'll try to do justice to below). With Java EE 7 under our belt and the horizons for Java EE 8 emerging, this is a good time to thank everyone that took the survey once again for their thoughts and let you know what the impact of your voice actually was. As an aside, you may be happy to know that we are working hard behind the scenes to try to put together a similar survey to help kick off the agenda for Java EE 8 (although this is by no means certain). I'll break things down by the questions asked in the survey, the responses and the resulting change in the specification.

    APIs to Add to Java EE 7 Full/Web Profile

    The first question in the survey asked which of four new candidate APIs (WebSocket, JSON-P, JBatch and JCache) should be added to the Java EE 7 Full and Web Profile respectively. Developers by and large wanted all the new APIs added to the full platform. The comments expressed particularly strong support for WebSocket and JCache. Others expressed dissatisfaction over the lack of a JSON binding (as opposed to JSON processing) API. WebSocket, JSON-P and JBatch are now part of Java EE 7. In addition, the long-awaited Java EE Concurrency Utilities API was also included in the Full Profile. Unfortunately, JCache was not finalized in time for Java EE 7 and the decision was made not to hold up the Java EE release any longer. JCache continues to move forward strongly and will very likely be included in Java EE 8 (it will be available much sooner than Java EE 8 to boot). An emergent standard for JSON-B is also a strong possibility for Java EE 8. When it came to the Web Profile, developers were supportive of adding WebSocket and JSON-P, but not JBatch and JCache. Both WebSocket and JSON-P are now part of the Web Profile, which also includes the already popular JAX-RS API.

    Enabling CDI by Default

    The second question asked whether CDI should be enabled in Java EE by default. The overwhelming majority of developers supported the default enablement of CDI. In addition, developers expressed a desire for better CDI/Java EE alignment (with regards to EJB and JSF in particular). Some developers expressed legitimate concerns over the performance implications of enabling CDI globally, as well as the potential conflict with other JSR 330 implementations like Spring and Guice. CDI is enabled by default in Java EE 7. Respecting the legitimate concerns, CDI 1.1 was very careful to add additional controls around component scanning. While a lot of work was done in Java EE 6 and Java EE 7 around CDI alignment, further alignment is under serious consideration for Java EE 8.

    Consistent Usage of @Inject

    The third question was around using CDI/JSR 330 @Inject consistently vs. allowing JSRs to create their own injection annotations (e.g. @BatchContext). A majority of developers wanted consistent usage of @Inject. The comments again reflected a strong desire for CDI/Java EE alignment. A lot of emphasis in Java EE 7 was put into using @Inject consistently. For example, the JBatch specification is focused on using @Inject wherever possible. JAX-RS remains an exception with its existing custom injection annotations. However, the JAX-RS specification leads understand the importance of eventual convergence, hopefully in Java EE 8.

    Expanding the Use of @Stereotype

    The fourth question was about expanding CDI @Stereotype to cover annotations across Java EE, beyond just CDI. A solid majority of developers supported the idea of making @Stereotype more universal in Java EE. The comments maintained the general theme of strong support for CDI/Java EE alignment. Unfortunately, there was not enough time and resources in Java EE 7 to implement this fairly pervasive feature. However, it remains a serious consideration for Java EE 8.

    Expanding Interceptor Use

    The final set of questions was about expanding interceptors further across Java EE. Developers strongly supported the concept. Along with injection, interceptors are now supported across all Java EE 7 components including Servlets, Filters, Listeners, JAX-WS endpoints, JAX-RS resources, WebSocket endpoints and so on.

    I hope you are encouraged by how your input to the survey helped shape Java EE 7 and continues to shape Java EE 8. Participating in these sorts of surveys is of course just one way of contributing to Java EE. Another great way to stay involved is the Adopt-A-JSR Program. A large number of developers are already participating through their local JUGs. You could of course become a Java EE JSR expert group member or observer. You should stay tuned to The Aquarium for the progress of the Java EE 8 JSRs if that's something you want to look into...


  • Alsa doesn't work in vlc

    - by freebird
    The ALSA audio output works fine from the terminal, e.g. aplay /usr/share/sounds/alsa/Noise.wav, but I have to change from the default to the ALSA audio output in VLC; I found the setting under Tools → Preferences → Audio → Output. The issue is that when I change it to ALSA, I lose all sound. When I leave the default, I get an annoying audio delay of about 200 ms or 500 ms. From what I have found, you have to use the ALSA audio output to fix that issue.

    Update 6-26-2011 10:28 pm. To fix the ALSA audio output:

        sudo add-apt-repository ppa:ferramroberto/vlc
        sudo apt-get update
        sudo apt-get install vlc mozilla-plugin-vlc

    Then I opened Update Manager, where there were two updates for VLC; I installed them and rebooted. Now ALSA works fine and the audio is in sync with the video.


  • Bring 2 GB Large Pages to Solaris 10

    - by Giri Mandalika
    A few facts:
    - 8 KB is the default page size on Oracle Solaris 10 and 11 as of this writing
    - Both hardware and software must have support for 2 GB large pages
    - SPARC T4 processors are capable of supporting 2 GB pages
    - The Oracle Solaris 11 kernel has in-built support for 2 GB pages
    - Oracle Solaris 10 has no default support for 2 GB pages
    - Memory-intensive 64-bit applications may benefit the most from using 2 GB pages

    Prerequisites:
    - OS: Oracle Solaris 10 8/11 (Update 10) or later
    - Hardware: Oracle servers with SPARC T4 processors, e.g. SPARC T4-1, T4-2 or T4-4, SPARC SuperCluster T4-4

    Steps to enable 2 GB large pages on Oracle Solaris 10:
    1. Install the latest kernel patch, or ensure that 147440-04 or later was installed (check the patch download instructions).
    2. Add the following line to /etc/system and reboot:

        set max_uheap_lpsize=0x80000000

    3. Finally, check the output of the pagesize -a command when the system is back online, e.g.:

        % pagesize -a
        8192          <-- 8K
        65536         <-- 64K
        4194304       <-- 4M
        268435456     <-- 256M
        2147483648    <-- 2G
        % uname -a
        SunOS jar-jar 5.10 Generic_147440-21 sun4v sparc sun4v

    Also see:
    - Solaris 9 or later: More performance with Large Pages (MPSS)
    - Large page support for instructions (text) in Solaris 10 1/06
    - Solaris: How To Disable Out Of The Box (OOB) Large Page Support?
    - Memory fragmentation / Large Pages on Solaris x86


  • VirtualBox: Import on different platform

    - by katsumii
    VirtualBox behaves almost the same on different OSes, but the default name of the host-only network is different: on Windows it is "VirtualBox Host-Only Ethernet Adapter" (shown in the screenshot in the original post), while on Linux it is named "vboxnet0" by default. This causes a problem on cross-platform export/import, and it has already been reported: #7067 (VERR_INTERNAL_ERROR: Inexistent host networking interface, named 'vboxnet0') – Oracle VM VirtualBox:

        VERR_INTERNAL_ERROR: Inexistent host networking interface, named 'vboxnet0'

    There are at least two workarounds:
    - Open the imported VM in the VirtualBox GUI and pick the correct network.
    - Run a CLI command like the one below (this one is for Windows):

        VBoxManage.exe modifyvm node1_1 --hostonlyadapter1 "VirtualBox Host-Only Ethernet Adapter"
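
    The Linux-side equivalent of the same workaround would presumably look like this, with vboxnet0 being the default host-only interface name mentioned above:

        # List the host-only interfaces known to this host first
        VBoxManage list hostonlyifs

        # Point the imported VM's first host-only adapter at the local interface
        VBoxManage modifyvm node1_1 --hostonlyadapter1 vboxnet0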


  • Power manager keeps shutting down the display even when I tell it never to do that

    - by david25
    I'm probably missing something here. I'm using the default Ubuntu power manager, and I set it up like this: on AC, no screen dimming when idle, never put the computer to sleep, never put the display to sleep. On battery I kept the default settings. Still, Ubuntu does whatever it likes: after 15 minutes it puts the display to sleep. Is anyone having the same problem and has found a way to bypass it? I'm using an Eee PC with Ubuntu 10.10 desktop.
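
    One thing worth ruling out is that the blanking comes from the X server's own DPMS/screensaver timers rather than from gnome-power-manager. A quick sketch for the current session:

        # Show the screensaver and DPMS timeouts X itself is enforcing
        xset q

        # Disable DPMS and X screen blanking for this session only
        xset -dpms
        xset s off

    If the display then stays on, the culprit is the X-level timeout rather than the power manager settings.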


  • Setting up international keyboard layouts over X? Why do my keyboard layouts get reset after reboot?

    - by hhh
    I have asked a related question on different sites, such as here in German, and there is a related thread here (a different case in the latter, though). I almost solved the question here; basically an /etc/default/keyboard modification plus the one-liner

        $ setxkbmap -option grp:caps_toggle -variant dvorak-intl,nodeadkeys, us,de,no &

    but the layout settings get reset after reboot. I use Debian, but I believe the same settings apply to Ubuntu, hence asking here. So how can I get the settings to persist after rebooting?

        $ cat /etc/default/keyboard
        XKBMODEL="pc105"
        XKBLAYOUT="us,de,no"
        XKBVARIANT="dvorak-intl,nodeadkeys,"
        XKBOPTIONS="grp:caps_toggle"
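
    Since setxkbmap only affects the running X session, one way to make the setup stick (a sketch; ~/.xprofile being sourced at graphical login is an assumption about your display manager) is to persist the console-level config and re-run the same setxkbmap call at every login:

        # Regenerate the system keyboard config from /etc/default/keyboard
        sudo dpkg-reconfigure keyboard-configuration

        # Re-apply the X layout at every graphical login
        cat >> ~/.xprofile <<'EOF'
        setxkbmap -option grp:caps_toggle -variant dvorak-intl,nodeadkeys, us,de,no
        EOF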


  • Lubuntu 13.04 resolution issue due to ppa:makson96

    - by Choupa
    I've installed Lubuntu 13.04 on my notebook (it's the first time I've used Lubuntu). I had graphical issues, apparently due to the graphics driver (ATI Radeon Xpress 1200). I followed this procedure to correct the problem:

        sudo add-apt-repository ppa:makson96
        sudo apt-get update
        sudo apt-get upgrade
        sudo apt-get install fglrx-legacy

    My graphical issue has been corrected, but now the screen shows a low resolution (1024x768) and I can't change it. The xrandr result:

        xrandr: failed to get size of gamma for output default
        Screen 0: minimum 1024x768, current 1024x768, maximum 1024x768
        default connected 1024x768+0+0 0mm x 0mm
           1024x768        0.0*

    I've already read this topic, but I don't really understand how things work there. I'd really appreciate your help! Many thanks!
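
    If fglrx-legacy still leaves xrandr with only the single "default" mode, a common (though not guaranteed) workaround is to add the panel's native mode by hand; the 1280x800 numbers below are only an example, so run cvt with your panel's actual native resolution first and copy the modeline it prints:

        # Generate a modeline for the native resolution (example: 1280x800 @ 60 Hz)
        cvt 1280 800

        # Register the modeline reported by cvt and attach it to the "default" output
        xrandr --newmode "1280x800_60.00" 83.50 1280 1352 1480 1680 800 803 809 831 -hsync +vsync
        xrandr --addmode default 1280x800_60.00
        xrandr --output default --mode 1280x800_60.00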

