Search Results


  • Organizing Git repositories with common nested sub-modules

    - by André Caron
    I'm a big fan of Git sub-modules. I like to be able to track a dependency along with its version, so that you can roll back to a previous version of your project and have the corresponding version of the dependency to build safely and cleanly. Moreover, it's easier to release our libraries as open source projects, as the history for the libraries is separate from that of the applications that depend on them (and which are not going to be open sourced). I'm setting up a workflow for multiple projects at work, and I was wondering what it would be like if we took this approach to a bit of an extreme, instead of having a single monolithic project. I quickly realized there is a potential can of worms in really using sub-modules. Suppose a pair of applications, studio and player, and dependent libraries core, graph and network, where the dependencies are as follows:

    - core is standalone
    - graph depends on core (sub-module at ./libs/core)
    - network depends on core (sub-module at ./libs/core)
    - studio depends on graph and network (sub-modules at ./libs/graph and ./libs/network)
    - player depends on graph and network (sub-modules at ./libs/graph and ./libs/network)

    Suppose that we're using CMake and that each of these projects has unit tests and all the works. Each project (including studio and player) must be able to be compiled standalone to perform code metrics, unit testing, etc. The thing is, after a recursive git submodule fetch, you get the following directory structure:

        studio/
        studio/libs/                    (sub-module depth: 1)
        studio/libs/graph/
        studio/libs/graph/libs/         (sub-module depth: 2)
        studio/libs/graph/libs/core/
        studio/libs/network/
        studio/libs/network/libs/       (sub-module depth: 2)
        studio/libs/network/libs/core/

    Notice that core is cloned twice in the studio project. Aside from wasting disk space, this gives me a build system problem, because core is built twice and I potentially get two different versions of core.

    Question: how do I organize sub-modules so that I get the versioned dependency and standalone builds without getting multiple copies of common nested sub-modules?

    Possible solution: if the library dependency is somewhat of a suggestion (i.e. in a "known to work with version X" or "only version X is officially supported" fashion) and potential dependent applications or libraries are responsible for building with whatever version they like, then I could imagine the following scenario:

    - Have the build systems for graph and network tell them where to find core (e.g. via a compiler include path).
    - Define two build targets, "standalone" and "dependency", where "standalone" is based on "dependency" and adds the include path to point to the local core sub-module.
    - Introduce an extra dependency: studio on core.

    Then, studio builds core, sets the include path to its own copy of the core sub-module, then builds graph and network in "dependency" mode. The resulting folder structure looks like:

        studio/
        studio/libs/                    (sub-module depth: 1)
        studio/libs/core/
        studio/libs/graph/
        studio/libs/graph/libs/         (empty folder, sub-modules not fetched)
        studio/libs/network/
        studio/libs/network/libs/       (empty folder, sub-modules not fetched)

    However, this requires some build system magic (I'm pretty confident this can be done with CMake; see the sketch below) and a bit of manual work on the part of version updates (updating graph might also require updating core and network to get a compatible version of core in all projects). Any thoughts on this?
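    To make that build-system magic concrete, here is roughly the CMake guard I have in mind (a sketch, untested; target and path names are illustrative): every project that embeds core wraps its add_subdirectory() call in an if(NOT TARGET ...) check, so the top-level project adds core exactly once and the nested copies are skipped.

        # studio/CMakeLists.txt (sketch)
        project(studio)

        # studio adds its own copy of core first ...
        if(NOT TARGET core)
            add_subdirectory(libs/core)
        endif()

        # ... so when graph and network evaluate the same guarded call in
        # their own CMakeLists.txt files, the target already exists and
        # their nested libs/core copies are never built.
        add_subdirectory(libs/graph)
        add_subdirectory(libs/network)

        # graph/CMakeLists.txt uses the identical guard, which also makes
        # a standalone build of graph fall back to its own sub-module:
        #     if(NOT TARGET core)
        #         add_subdirectory(libs/core)
        #     endif()
        #     target_link_libraries(graph core)

    This doesn't avoid the duplicate clones on disk by itself, but it would guarantee a single core build and a single set of core headers per build tree.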


  • Problems with HP laptop

    - by Jan
    So the problem is: every time I reboot or shut down the laptop, when it boots again I get a screen (before the OS loads) saying that HP discovered overheating and the system went into hibernation. But the point is that the laptop is not overheating, nor is it going into hibernation by itself. Also, because of the hybrid graphics cards, I cannot install additional drivers. Desktop resolution and everything else works perfectly, but I cannot use Unity 3D, and OpenGL doesn't work as it should (with Cairo docks). As I've read in some posts, people say that switcheroo doesn't work on 12.04, so I haven't tried it.


  • Pre-built Oracle VirtualBox Images

    - by james.bayer
    I'm thrilled to see that Justin Kestelyn has a post announcing that pre-built Oracle VirtualBox images are now available on OTN. There are VMs for various Oracle software stacks, including one for Database, one for Java with GlassFish, and one for SOA and BPM products that includes WebLogic Server. This is just one example of the synergy of a combined Oracle and Sun delivering improvements for customers. These VMs make it even more straightforward to get started with Oracle software in a development environment, without having to worry about initial software installation and configuration. I've been a bit quiet lately on the blogging front, but I'm currently working on another area leveraging the best of Oracle and Sun. Oracle is uniquely positioned to deliver engineered systems that optimize the entire stack of software and hardware. You've probably seen the announcements about Exalogic, and I'm excited about the potential to deliver major advancements for middleware. More to come…


  • How to port animation from one skeleton to another?

    - by shawn
    While I need to do this in a Blender3D modeler script, the math should be similar for other modelers or realtime engines. Blender3D-specific terminology:

    - Armature = skeleton
    - EditBone = rest-pose bone (stores the rest-pose matrix)
    - PoseBone = can store a different pose (animation matrix) for each frame of your animation

    I need to share animations (Blender Actions) between Armatures which have EditBones with the same names and the same positions, but which can have different (rest-pose) angles and scales. Plus, the Armatures might have different bone hierarchies (bone parenting / no bone parenting).

    Why I need this: I've made an importer/exporter for a 3d format for a game. The format doesn't store enough info to connect/parent the bones, which makes posing/animating character models in a 3d modeler nearly impossible (original model files for the 3d modeler don't exist; this is for modding). As there are only 2 character skeleton types in the game, I decided to optionally allow generating the bones from hardcoded data in the model importer, and to undo that in the exporter. This makes it easy to pose the model for checking weights, makes it easy to create weights, makes it easier for Blender to generate automatic weights, and of course makes animating possible. This worked perfectly: the importer optionally generated the Armature itself and the exporter removed those changes, so the exported model works with existing animations in the game. But now I'm writing an importer and exporter for the game's animation format, and here come the problems of:

    - trying to make original animations work in Blender with my "custom" (modified) Armature, and
    - trying to make animations created using the "custom" (modified) Armature work with the original models in the game (and Blender).

    Constraints or bone snapping inside Blender won't work, as they don't care that the bones have different angles in the rest pose; they will still face the same direction. It seems I just need to get the "difference" between the EditBone matrices of all EditBones for the two Armatures somehow, and apply that difference to the PoseBone matrices of all PoseBones, for all frames of my animation. I need to know how to get that difference and how to apply it. BTW, PoseBone matrices are relative to the rest pose; by default they are the identity:

        [1.0, 0.0, 0.0, 0.0]
        [0.0, 1.0, 0.0, 0.0]
        [0.0, 0.0, 1.0, 0.0]
        [0.0, 0.0, 0.0, 1.0]

    So the question is: how do I get the difference between two bone (EditBone) matrices and apply that difference to the animation matrices (PoseBone matrices)? Please be easy on the matrix math.
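    For what it's worth, here is how far I've got with the math: if a bone's armature-space transform is rest * pose (with the pose matrix stored relative to the rest pose), then keeping the same armature-space result under a new rest matrix means pose_new = rest_new^-1 * rest_old * pose_old. A sketch of that in Blender-style Python (attribute names like matrix_local and matrix_basis are from the 2.5x API and may differ in other versions; parenting and constraints are ignored):

        # Re-express armature A's pose matrices relative to armature B's
        # rest pose.  Assumes both armatures have identically named bones.
        def retarget(arm_a, arm_b, pose_b):
            for name, pbone in pose_b.bones.items():
                rest_a = arm_a.bones[name].matrix_local  # rest matrix, armature space
                rest_b = arm_b.bones[name].matrix_local
                delta = rest_b.inverted() * rest_a       # difference of the rest poses
                # apply the difference to the stored animation matrix;
                # repeat this for every frame (or bake the action first)
                pbone.matrix_basis = delta * pbone.matrix_basis

    I'd still like confirmation that this is the right decomposition, especially for bones whose hierarchy differs between the two armatures.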


  • Dead keys in emacs with ibus

    - by Virgile
    I've just upgraded to 13.10 and noticed that dead keys are no longer working in emacs (a keystroke on ' leads emacs to display <dead-acute> is undefined instead of waiting for the next key). In addition, use of the compose key leads to <Multi_key> is undefined, and it is impossible to use keybindings such as <M-^>. Other applications work fine as far as I can tell. A brief search on the internet suggested adding (require 'iso-transl) to .emacs. This solves the first issue, but not the other ones. Another possible workaround seen on the web is to launch emacs with an empty XMODIFIERS variable, as XMODIFIERS='' emacs, instead of XMODIFIERS=@im=ibus, which seems to be the default in 13.10. Then everything works fine, but it looks like a kludge. Is there a way to make emacs work with ibus in this respect?


  • How do I restrict my kids' computing time?

    - by Takkat
    Access to our computer (not only to the internet) needs to be restricted for the accounts of my kids (7, 8) until they are old enough to manage this by themselves. Until then we need to be able to define the following:

    - the hours of the day when computing is o.k. (e.g. 5 - 9 pm)
    - the days of the week when computing is not o.k. (e.g. Mondays to Fridays)
    - the amount of time allowed per day (e.g. 2 hours)

    In 11.10, all of the following tools that used to do the job don't work any more:

    - Timekpr: not available for 11.10 through the PPA, and the installed version from 11.04 does not work in 11.10.
    - Timeoutd: a command-line alternative, but removed from the repositories in 11.10.
    - Gnome Nanny: looks great but repeatedly crashes, forcing a restart of the X server, so we can't use or recommend this program at the moment.

    Are there any other alternatives?
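    One alternative we are looking at is PAM's pam_time module, which ships with Ubuntu and covers the first two requirements (allowed hours and days), though not the per-day time budget. A sketch, assuming the kids' accounts are named kid1 and kid2 and that 11.10's lightdm is the login service (both names are assumptions):

        # /etc/security/time.conf: allow graphical logins only 5-9 pm
        # on weekend days; the format is services;ttys;users;times
        lightdm;*;kid1|kid2;Wd1700-2100

        # and in /etc/pam.d/lightdm, enable the module:
        # account required pam_time.so

    It won't log a child out at 9 pm by itself; it only refuses new logins outside the window.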


  • SQL Table stored as a Heap - the dangers within

    - by MikeD
    Nearly all of the time I create a table, I include a primary key, and often that PK is implemented as a clustered index. Those two don't always have to go together, but in my world they almost always do. On a recent project, I was working on a data warehouse and a set of SSIS packages to import data from an OLTP database into my data warehouse. The data I was importing from the business database into the warehouse was mostly new rows, sometimes updates to existing rows, and sometimes deletes. I decided to use the MERGE statement to implement the insert, update or delete in the data warehouse. I found it quite performant to have a stored procedure that extracted all the new, updated, and deleted rows from the source database and dumped them into a working table in my data warehouse, then run a stored proc in the warehouse that was the MERGE statement that took the rows from the working table and updated the real fact table.

        USE Warehouse

        CREATE TABLE Integration.MergePolicy (
            PolicyId int,
            PolicyTypeKey int,
            Premium money,
            Deductible money,
            EffectiveDate date,
            Operation varchar(5)
        )

        CREATE TABLE fact.Policy (
            PolicyKey int identity primary key,
            PolicyId int,
            PolicyTypeKey int,
            Premium money,
            Deductible money,
            EffectiveDate date
        )

        CREATE PROC Integration.usp_MergePolicy
        AS
        BEGIN
            BEGIN TRAN

            MERGE fact.Policy AS tgt
            USING Integration.MergePolicy AS src
               ON (tgt.PolicyId = src.PolicyId)
            WHEN NOT MATCHED BY TARGET THEN
                INSERT (PolicyId, PolicyTypeKey, Premium, Deductible, EffectiveDate)
                VALUES (src.PolicyId, src.PolicyTypeKey, src.Premium, src.Deductible, src.EffectiveDate)
            WHEN MATCHED AND src.Operation = 'U' THEN
                UPDATE SET PolicyTypeKey = src.PolicyTypeKey,
                           Premium = src.Premium,
                           Deductible = src.Deductible,
                           EffectiveDate = src.EffectiveDate
            WHEN MATCHED AND src.Operation = 'D' THEN
                DELETE
            ;

            DELETE FROM Integration.MergePolicy

            COMMIT
        END

    Notice that my work table (Integration.MergePolicy) doesn't have any primary key or clustered index. I didn't think this would be a problem, since it was a relatively small table and was empty after each run of the stored proc. For one of the work tables, during the initial loads of the warehouse, about 1.5 million rows were inserted, processed, then deleted. Also, because of a bug in the extraction process, the same 1.5 million rows (plus a few hundred more each time) were getting inserted, processed, and deleted. This was being done on a fairly hefty server that was otherwise unused, and no one was paying any attention to the time it was taking. This week I received a backup of this database and loaded it on my laptop to troubleshoot the problem, and of course it took a good ten minutes or more to run the process. However, what seemed strange to me was that after I fixed the problem and happened to run the merge sproc when the work table was completely empty, it still took almost ten minutes to complete. I immediately looked back at the MERGE statement to see if I had some sort of outer join that meant it would be scanning the target table (which had about 2 million rows in it), then turned on the execution plan output to see what was happening under the hood. Running the stored procedure again took a long time, and the plan output didn't show me much: 55% on the MERGE statement, 45% on the DELETE statement, and table scans on the work table in both places. I was surprised at the relative cost of the DELETE statement, because there were really 0 rows to delete, but I was expecting to see the table scans.
    (I was beginning to suspect that my problem was because the work table was being stored as a heap.) Then I turned on STATISTICS IO and ran the sproc again. The output was quite interesting:

        Table 'Worktable'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'Policy'. Scan count 0, logical reads 0, physical reads 0, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.
        Table 'MergePolicy'. Scan count 1, logical reads 433276, physical reads 60, read-ahead reads 0, lob logical reads 0, lob physical reads 0, lob read-ahead reads 0.

    I've reproduced the above from memory, so the details aren't exact, but the essential bit was the very high number of logical reads on the table stored as a heap. Even just doing a SELECT COUNT(*) FROM Integration.MergePolicy incurred that sort of output, even though the result was always 0. I suppose I should research more on the allocation and deallocation of pages for tables stored as a heap, but I haven't, and my original assumption that a table stored as a heap with no rows would only need to read one page to answer any query was definitely proven wrong. It's likely that some sort of physical defragmentation of the table may have cleaned that up, but it seemed that the easiest answer was to put a clustered index on the table. After doing so, the execution plan showed a clustered index scan, and the IO stats showed only a single page read. (I aborted my first attempt at adding a clustered index to the table because it was taking too long; instead I ran TRUNCATE TABLE Integration.MergePolicy first and then added the clustered index, both of which took very little time.) I suspect I may not have noticed this if I had used TRUNCATE TABLE Integration.MergePolicy instead of DELETE FROM Integration.MergePolicy in the sproc, since I'm guessing that the truncate operation does some rather quick releasing of pages allocated to the heap table. In the future, I will likely be much more careful to have a clustered index on every table I use, even the working tables. Mike
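    For reference, the eventual fix was just two short statements (the index name and key column here are my choice; anything sensible works):

        -- empty the heap quickly, releasing its pages, then add the index
        TRUNCATE TABLE Integration.MergePolicy;

        CREATE CLUSTERED INDEX IX_MergePolicy_PolicyId
            ON Integration.MergePolicy (PolicyId);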


  • Acer Aspire 3690 wireless not working, new Ubuntu user

    - by drew
    I don't know that much about Ubuntu, but I put it on an Acer Aspire 3690 laptop (replacing Win Vista). Everything seems to work fine but the wireless connection. It says "FIRMWARE MISSING". It detects that I have a wireless card, so I don't know what to do. I've already used the "Additional Drivers" tool, and that didn't work. I've googled and have not found anything helpful. I don't really know what I'm doing, so can your directions please be kindergarten level? Thank you very much for any help you can provide. I really appreciate it.


  • Can't boot Ubuntu 12.10 32 or 64 Bit, only Ubuntu 12.04 32 Bit [closed]

    - by Alexander
    Possible Duplicate: My computer boots to a black screen, what options do I have to fix it?

    I tried to install Ubuntu 12.04 64-bit and Ubuntu 12.10 32- and 64-bit, but it doesn't work. I used the Ubuntu 12.04 32-bit Startup Disk Creator and also UNetbootin on Win7. The installation process finishes and I restart without the stick. I can choose, for example, 12.10, and it starts writing "start ... [OK], ...", but then it usually hangs on "Stop Kernel Messages [OK]". After that I can only shut down the system normally, and it writes stopping, shutdown and the like. I am using an Aspire One D270 netbook with an Intel Atom N2600. Trying Ubuntu 12.10 live from the USB stick also doesn't work: it starts, but then the screen goes black with a blinking cursor at the top left. Please can you help me? :(


  • How do I run a 64-bit guest in VirtualBox?

    - by ændrük
    I would like to have an Ubuntu 11.04 64-bit test environment. When I try booting the Ubuntu 11.04 64-bit installation CD in VirtualBox, the following message is displayed by VirtualBox:

        VT-x/AMD-V hardware acceleration has been enabled, but is not operational.
        Your 64-bit guest will fail to detect a 64-bit CPU and will not be able to boot.
        Please ensure that you have enabled VT-x/AMD-V properly in the BIOS of your host computer.

    What am I doing wrong? Details: VBox.log, ubuntu-test.vbox, and /proc/cpuinfo.

    Kernel: Linux aux 2.6.38-8-generic #42-Ubuntu SMP Mon Apr 11 03:31:24 UTC 2011 x86_64 x86_64 x86_64 GNU/Linux

    The Virtualization setting in the BIOS is set to Enabled.
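    For completeness, here is what I have checked from the host side (the VM name matches my ubuntu-test.vbox above; the grep simply confirms the CPU flag is visible to Linux):

        # does the host CPU advertise hardware virtualization at all?
        egrep 'vmx|svm' /proc/cpuinfo

        # make sure VT-x/AMD-V is switched on for this particular VM
        VBoxManage modifyvm "ubuntu-test" --hwvirtex on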


  • Wearables and UX Innovation: In Spanish and English

    - by ultan o'broin
    Good examples of Oracle's commitment to tech diversity and to innovation can be seen everywhere. Here's a couple of videos from the Oracle Applications User Experience (UX) team, featuring Sarahi Mireles (@sarahimireles) and Noel Portugal (@noelportugal), who work together on some very cool stuff. The videos are available on the Oracle Technology Network Architecture (OTNArchBeat) Community Video Channel on YouTube. Sarahi and Noel show you how cool people work together on some awesome innovations, worldwide.

    [Image: Sarahi Mireles showing off a Spanish-language Pebble watch Facebook notification]

    The videos are in Spanish and English and feature the latest in wearable technology that the UX team is exploring and that UX team members themselves love to use. Check out what they have to say in your preferred language:

    - Manos libres y vista al frente: Con el futuro puesto
    - Heads Up and Hands Free: Wearing the Future

    Interested in knowing more or joining us? Find out more on Facebook about the Oracle Applications User Experience team and the Oracle Mexico Development Center.


  • Claims-based Identity in .NET 4.5 and Windows 8

    - by Your DisplayName here!
    There was not a ton of new information about WIF and related technologies at Build, but Samuel Devasahayam did a great talk about claims-based access control that contained some very interesting bits of information with regard to future directions. From his slides:

    Windows 8
    - Bring the existing identity claims model into the Windows platform
    - Domain controller issues groups & claims
    - Claims (user and device) sourced from identity attributes in AD
    - Claims delivered in the Kerberos PAC
    - NT token has a new claims section
    - Enhanced SDDL APIs to work with claims
    - Enhanced user-mode CheckAccess APIs to work with claims
    - New ACL-UX
    - Target audits with claims-based expressions

    WIF & .NET 4.5
    - WIF is in the box with .NET Framework 4.5
    - Every principal in .NET 4.5 is a ClaimsPrincipal

    ADFS 2.1
    - ADFS 2.1 is available now as an in-box server role in Windows 8
    - Adds support for issuing device claims from the Kerberos ticket


  • Blu-ray player?

    - by Dox
    I'd like to play Blu-ray discs on my laptop. I found the official documentation, and there it's explained that one should use mplayer and ffmpeg. Looking at the repositories, there exist two different mplayer packages (in conflict with each other):

    - mplayer
    - mplayer2

    Any idea which of them I should install? On the other hand, the official documentation seems to be out of date, since no Ubuntu release after 9.04 is mentioned. Does the DumpHD package from the repositories work? Finally, where can the keydb.cfg keys be found? I'm open to suggestions, especially from people who have done the job of making it work. Cheers


  • SQL Server 2008 Service Pack 1 and the Invoke or BeginInvoke cannot be called error message

    - by Jeff Widmer
    When trying to install SQL Server 2008 Service Pack 1 on a SQL Server 2008 instance that is running on a virtual machine, the installer starts, but after about 20 seconds I receive the following error message:

        TITLE: SQL Server Setup failure.
        ------------------------------
        SQL Server Setup has encountered the following error:
        Invoke or BeginInvoke cannot be called on a control until the window handle has been created.
        ------------------------------
        BUTTONS: OK
        ------------------------------

    Searching for this issue, I found that several people have the same problem and there is no clear solution. Some had success with closing windows or Internet Explorer, but that didn't work for me. What did work is to make sure the "Please wait while SQL Server 2008 Setup processes the current operation." dialog is selected and has the focus when it first shows up.

    Add a comment if you find out any information about how to consistently get around this issue, or why it is happening in the first place.


  • Good Scoop: The PeopleSoft/IBM Backstory

    - by [email protected]
    Sometimes you're searching for something online and you find an unrelated, bonus nugget. Last week I stumbled across an interesting blog post from Chris Heller of a PeopleSoft consulting shop in San Ramon, CA called Grey Sparling. I don't know these guys. But Chris, who apparently used to work on the PeopleTools team, wrote a great article on a pre-acquisition, would-be deal between IBM and PeopleSoft that would have standardized PeopleSoft on IBM technology. The behind-the-scenes perspective is interesting. His commentary on the challenges that the company and PeopleSoft customers would have encountered if the deal had gone through was also interesting:

    - "No common ownership. It's hard enough to get large groups of people to work together when they work for the same company, but with two separate companies it is much, much harder. Even within Oracle, progress on Fusion applications was slow until Thomas Kurian took over Fusion applications in addition to Fusion middleware."

    - "No customer buy-in. PeopleSoft customers weren't asking for a conversion to WebSphere, so the fact that doing that could have helped PeopleSoft stay independent wouldn't have meant much to them, especially since the cost of moving to whatever a "PeopleSoft built on WebSphere" would have been significant."

    - "No executive buy-in. This is related to the previous point, but it's worth calling out separately. If Oracle had walked away and the deal with IBM had gone through, and PeopleSoft customers got put through the wringer as part of the WebSphere move, all of the PeopleSoft project teams would be put in the awkward position of explaining to their management why these additional costs and headaches were happening. Essentially they would need to "sell" the partnership internally to their own management team. That's not a fun conversation to have."

    I'm not surprised that something like this was in the works. But I did find the inside scoop and Heller's perspective on the challenges particularly interesting, especially the advantages of aligning development of applications and infrastructure under one roof. Here's a link to the whole blog entry.


  • How do I get my mac to boot from an Ubuntu USB key?

    - by Vinay Gupta
    If you select "Mac" and "USB" on this download page, it gives a series of command-line instructions to make a USB key which the MacBook will boot into Ubuntu from: http://www.ubuntu.com/desktop/get-ubuntu/download I've followed them to the letter two or three times on different USB keys, and it doesn't work. There's a very great deal of technical discussion about EFI etc., but this set of instructions seems to suggest it should Just Work, and it doesn't. Help? I'm increasingly unhappy with the more locked-down approach Apple is taking, and I'd quite like to start using Linux with a view to transitioning to it as my main operating system, but booting from the CD takes forever and runs slowly, and I'm really hoping to get it moving off USB. Can anybody help me?
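    For reference, the page's recipe boils down to roughly the following (the ISO name is an example, and N must be replaced with the disk number diskutil reports for the key, since dd overwrites whatever it is pointed at):

        hdiutil convert -format UDRW -o ubuntu.img ubuntu-desktop-amd64.iso
        diskutil list                     # find the USB key's disk number N
        diskutil unmountDisk /dev/diskN
        sudo dd if=ubuntu.img.dmg of=/dev/rdiskN bs=1m   # hdiutil appends .dmg
        diskutil eject /dev/diskN

    After writing the key I hold the Option key at power-on, but it never shows up as a boot choice.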


  • Vdpau performance in Precise with Unity 3d

    - by bowser
    VDPAU seems to be broken in Precise under Unity 3D. CPU usage ranges around 50-70% for 1080p movies, while the same movies use around 5-10% in Natty with VDPAU enabled (under Unity 3D). The card is an Nvidia G105M. It doesn't seem to be an Nvidia driver problem, because under GNOME Shell everything works as expected, and I have tried different versions of the Nvidia drivers (295.20, 295.33, 295.40 and the latest 302.xx from xorg-edgers). The results are all the same: works in GNOME Shell but not in Unity 3D. Disabling sync to vblank works if the movie is not in full-screen mode, but it doesn't work for full screen. I have searched around and haven't found much info. I am wondering if others are experiencing the same problem, and if there is some known workaround that I have missed. Unity 3D is otherwise very nice in Precise, but this is a show-stopping issue for me (literally). Thanks. I have filed a bug here: https://bugs.launchpad.net/unity/+bug/993397
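    For reference, this is how I measure it (standard mplayer VDPAU flags; the file name is a placeholder, and the trailing comma lets mplayer fall back to software decode if the hardware codec fails):

        # full VDPAU path: output driver plus hardware H.264 decode
        mplayer -vo vdpau -vc ffh264vdpau, movie-1080p.mkv

        # pure software path, for comparison
        mplayer -vo x11 movie-1080p.mkv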


  • How to get Visual Studio 2012 working on Ubuntu 12.04.1?

    - by kamil
    I am trying to get Visual Studio working in Unity (via Wine), without using a virtual machine or other desktop environment alternatives. I am convinced Visual Studio is the ultimate IDE for .NET programming languages. Dual booting isn't really an option for me either. I have been working with Visual Studio for more than 10 years and I prefer it over other IDEs. I have tried other IDEs, but they didn't work too well for me. Does anyone know a way to get this working natively?


  • Group arrival steering

    - by ltjax
    I've got group movement implemented pretty much like this: http://www.red3d.com/cwr/steer/CrowdPath.html Basically, that's combining path following and separation. It works nicely as long as units are in transit, but arrival does not work well at all. Right now, units just cease to use the path-following component once they "exit" the path, i.e. when their closest point on the path is on or past the end. This leads to those units bumping into each other and also overshooting the point the player clicked. Ideally, I'd have the units arrive scattered around the finish point (and reasonably close to each other), not all clumped up past the finish line. I'd imagine that some kind of arrival steering might work here, but based on other units and a "fuzzy" classification of the end of the path. Is there any proven way to do this?
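    To make the idea concrete, this is the kind of arrival behaviour I'm imagining, where each unit gets its own scattered slot around the click point and decelerates into it (a 2D sketch; the constants and tuple-based vectors are illustrative):

        import math
        import random

        SLOW_RADIUS = 4.0   # start decelerating inside this distance
        MAX_SPEED = 2.0

        def scatter_slots(goal, count, spread=3.0, seed=1):
            """Give each unit its own offset target around the clicked goal."""
            rng = random.Random(seed)
            slots = []
            for _ in range(count):
                angle = rng.uniform(0.0, 2.0 * math.pi)
                radius = spread * math.sqrt(rng.random())  # uniform over the disc
                slots.append((goal[0] + radius * math.cos(angle),
                              goal[1] + radius * math.sin(angle)))
            return slots

        def arrival_velocity(pos, slot):
            """Classic 'arrive': full speed far out, ramping down near the slot."""
            dx, dy = slot[0] - pos[0], slot[1] - pos[1]
            dist = math.hypot(dx, dy)
            if dist < 1e-6:
                return (0.0, 0.0)
            speed = MAX_SPEED * min(1.0, dist / SLOW_RADIUS)
            return (dx / dist * speed, dy / dist * speed)

    The separation force would stay on the whole time, so units that land on overlapping slots still push apart. What I don't know is whether this is the proven approach or whether there's something better.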


  • Develop website locally and push updates on Remote Server using Git

    - by John
    Together with a friend, we are looking to develop a website (using Symfony2). We are on shared hosting with SSH access. Below is the environment we'd like to set up:

    - use Git as version control (we are new to Git)
    - share the tasks and develop on our local machines
    - push the updates onto the remote server

    Here are our initial thoughts on how to do it (assuming Git is already running both locally and remotely):

    - install Symfony on the remote server (basic setup)
    - clone the project locally (using Git)
    - develop the project locally and push updates (using Git) to the remote server

    Does this approach make sense? If not, any recommendations? Thanks
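    From what we've read so far, the usual shape of this on a shared host is a bare repository on the server plus a post-receive hook that checks each push out into the web root. A sketch, with paths that are guesses about our host:

        # on the server: a bare repo beside the web root
        ssh user@host 'git init --bare ~/site.git'

        # contents of ~/site.git/hooks/post-receive (make it executable):
        #   #!/bin/sh
        #   GIT_WORK_TREE=$HOME/public_html git checkout -f master

        # locally: add the remote and push updates
        git remote add live ssh://user@host/~/site.git
        git push live master

    Does that match what others do in practice?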


  • How can I get six Xbox controllers to provide input to an HTML5 game?

    - by Daniel X Moore
    I'm creating a six-player HTML5 game designed to be played locally (Red Ice). I've previously set up handling 7 Wiimotes using something along the lines of Joy2Key to map each input for each player to a separate keyboard key, but Wiimotes are pretty hard on the hands for these types of games and not very ergonomic, so I thought I'd try to get Xbox controller support. I don't believe that any simple key-mapping solution will work, due to the nature of the directional stick. My inclination is that this will require a browser plugin, and if so I'd prefer to write the plugin for Google Chrome. How do I create a Chrome browser plugin to handle multiple Xbox controllers, or is there some other way? Please do not answer this question saying it can't be done, because it absolutely can. EDIT: I don't believe any keymapping/mouse-simulating solution will work unless it can reliably distinguish six axes of input, one per player.
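    On the "some other way" front, Chrome has been growing an experimental Gamepad API that would avoid a plugin entirely, and it exposes each stick as a full analog axis per pad. A sketch of polling it each frame (the API surface is still in flux, so names and the button representation may differ):

        // Poll up to six pads, e.g. from a requestAnimationFrame loop.
        function pollPads(players) {
          var getPads = navigator.getGamepads || navigator.webkitGetGamepads;
          if (!getPads) return;                   // API not available
          var pads = getPads.call(navigator);
          for (var i = 0; i < Math.min(pads.length, 6); i++) {
            var pad = pads[i];
            if (!pad) continue;                   // slot is empty
            players[i].moveX = pad.axes[0];       // left stick X, -1..1
            players[i].moveY = pad.axes[1];       // left stick Y, -1..1
            var b = pad.buttons[0];               // A button; number or object
            players[i].action = (typeof b === "number") ? b > 0.5 : b.pressed;
          }
        }

    That would reliably distinguish the six players' axes, since each pad reports its own axes array.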


  • Script at Startup

    - by OttoRobba
    I'm using 10.10 and I need to run a script in order to get a Windows-like international keyboard layout; basically, it changes how dead keys work. (Original script from this page: http://t.tam.atbh.us/en/win-us-intl-4-linux/ ) Since I can't seem to get it going from boot, I have to run a custom script to launch any application. The script:

        export GTK_IM_MODULE=xim
        setxkbmap us intl
        xmodmap -e 'keycode 48 = dead_acute dead_diaeresis dead_acute dead_diaeresis acute diaeresis'
        application_name

    So if I put abiword in place of application_name, it runs AbiWord respecting the keyboard script. Ideally, the original script would run at boot, and then any application I use would work with it, just like what happens if I run it first in a terminal (without the application_name line) and then launch apps from it. I tried to make the script run at boot by adding it to /etc/rc.local, but to no avail. I also tried adding it to init.d, but that didn't work either. If anyone can help, I'd be most grateful.
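    For what it's worth, these are per-X-session settings rather than boot-time ones, which is why /etc/rc.local and init.d never see your display. The usual fix is to put the lines in a file the display manager sources when your graphical session starts (on 10.10, GDM reads ~/.xprofile; that file location is the main assumption here):

        # ~/.xprofile -- runs once per graphical login
        export GTK_IM_MODULE=xim
        setxkbmap us intl
        xmodmap -e 'keycode 48 = dead_acute dead_diaeresis dead_acute dead_diaeresis acute diaeresis'

    After logging out and back in, every application launched in the session should inherit the layout without the wrapper script.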


  • How to enable true remote login

    - by Scán
    I don't quite know what these things are called, so a search did not produce any help. I've got two computers, a desktop and a netbook. The netbook is really weak, and it's hardly any fun working with it, especially since Ubuntu's software swallows so much CPU power for nothing. My desktop is good, but uncomfortably positioned. I know you can use any Linux system as a server that serves logins. I want to be able to log in to and work on my desktop from my netbook. No VNC, no SSH, but a full X server: I want to be able to choose "Login on Desktop" in the login menu on my netbook and have everything as if I were sitting there. I hope I've made my point clear. Is it possible on a local network, and if so, how can I easily set it up?
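    If I understand correctly, what I'm after is called XDMCP, the remote-login protocol that display managers speak. A sketch of what I'd apparently need to enable in GDM on the desktop (the stock config path; the firewall must allow UDP port 177, and the netbook then runs a plain X server pointed at the desktop):

        # /etc/gdm/custom.conf on the desktop machine
        [xdmcp]
        Enable=true

        # on the netbook, query the desktop's login greeter:
        #   X :1 -query desktop-hostname

    Note that the whole session's rendering still travels over the network, so a wired local network should work much better than wireless.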

