Search Results

Search found 23808 results on 953 pages for 'c source'.


  • Can a named (bind) crash make a server unreachable?

    - by giorgio79
    My server recently became unreachable, and after a restart a named error was the last thing I found in /var/log/messages before the reboot:
        Jun 26 00:15:06 host named[1303]: error (network unreachable) resolving 'dlv.isc.org/DNSKEY/IN': 2001:500:71::29#53
        Jun 26 06:38:55 host kernel: imklog 5.8.10, log source = /proc/kmsg started.
        Jun 26 06:38:55 host rsyslogd: [origin software="rsyslogd" swVersion="5.8.10" x-pid="1294" x-info="http://www.rsyslog.com"] start
        Jun 26 06:38:55 host kernel: Initializing cgroup subsys cpuset
    Can a named crash make a server unreachable? I doubt it, as I assume I should still have been able to log in over ssh via the IP address, but the server did not respond at all, so I am left making heavy guesses here.
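
    A few hedged checks that might narrow this down after the next incident (a minimal sketch, assuming a RHEL/CentOS-style layout where the syslog is /var/log/messages, as in the excerpt, and iproute2 tools are available; a crash of named alone should not stop sshd from answering, so a broader cause such as memory pressure or a dead network path is worth ruling out):

        # Did the kernel start killing processes before the hang?
        grep -iE 'oom|out of memory' /var/log/messages
        # Reboot/shutdown history around the incident
        last -x | head
        # The logged error is an unreachable IPv6 path to the DLV server
        ip -6 route show
        # Are sshd (22) and named (53) still listening right now?
        ss -tlnp | grep -E ':(22|53)\b'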

    Read the article

  • Automatically generated code: a "derived work"?

    - by Peregring-lk
    For example, suppose I have GPL software and I am its author. The source code contains Doxygen comments, which are written to generate an HTML documentation page that I want to publish on my project website under a CC-BY-SA license. But is the Doxygen output a "derived work"? After all, this documentation is based on my GPL source code, and in that case the documentation would have to be GPL. However, I want the documentation to be CC-BY-SA, because it is documentation. The GFDL doesn't help: GPL code can't become GFDL (though the opposite is possible). If the output really is a derived work, I think this creates a strange situation, because if I distribute my work, the recipients can't legally distribute the generated documentation under CC-BY-SA: while I can do what I want with my own work, the users can't, and thus they would have to distribute any derived work under the same license I offered them. What is the solution?

    Read the article

  • Photo transfer problems from camera

    - by warkior
    We have a digital camera (Canon SX130 IS) which we often connect to the Ubuntu 12.10 desktop via USB in order to download the images. In past flavours of Linux (Mint 12 was the most recent) this worked fine; however, since upgrading to Ubuntu 12.10 the process fails after downloading a small number of the images. I can view the images to be transferred in the preview window, and I can browse the camera's file system to download the images manually, but if I just drag and drop the images from camera to desktop, it freezes after 5-6 are copied. I've been able to work around the problem by copying only 3-4 at a time, but when you have 100+ images to transfer, that gets really frustrating. Any advice on where I could start looking for answers, or how I could diagnose the source of the problem further? We have also had some issues with wireless USB mice, though that may not be related. I'm hoping the USB controller in the computer isn't dying; it's not that old. Also, it all seems to work much better under Windows.
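
    One way to take the file manager's GVFS layer out of the equation is to pull the photos from the command line; this is only a diagnostic sketch, assuming the gphoto2 package is installed (sudo apt-get install gphoto2) and the camera speaks PTP:

        gphoto2 --auto-detect        # confirm the camera is recognised
        gphoto2 --list-files         # list the numbered files on the camera
        gphoto2 --get-all-files      # download everything into the current directory
        # or, if a full transfer still stalls, fetch in small batches by number:
        gphoto2 --get-file 1-20
        # if gphoto2 reports the device as busy, unmount the camera in the file manager first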

    Read the article

  • Linux: Alternative to rsync? (i.e., scp with resume)

    - by Joernsn
    I've been using rsync to automatically send files from one box to another, which is great compared to scp, since it supports resuming. However, when resuming a very large file (10 GB) rsync has to read both files and compare them, which is very slow. I don't need fancy error handling, just "scp with resume", so here's my question: is there an alternative to rsync/scp that supports resuming without having to read both the source and destination files? I've read the manuals without finding anything I can use; please let me know if I've missed something. This is the rsync line I've been using:
        rsync -av --partial --progress --inplace SRC DST
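
    For what it's worth, rsync itself has a mode that skips the comparison: a minimal sketch, assuming the partially transferred destination file is an intact prefix of the source (which is the case for an interrupted copy), where --append resumes by simply appending the missing tail without running the delta algorithm over the existing data:

        rsync -av --partial --progress --append SRC DST
        # --append-verify also exists, but it re-checksums the data already at the
        # destination, which reintroduces much of the slow re-reading.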

    Read the article

  • What are your "must have", free (gratis) programs?

    - by flybywire
    Poll: What software must you always keep handy? I don't care if it is open source, freeware, or demo, as long as its price is $0. Neither do I care if it is for desktops, handhelds, netbooks, web based, cellphones. If it is free to use – and essential to your happiness and well-being – put it in this list. Rules: Please, list only ONE application per answer, so that people can vote up the items that they prefer. Please do not post applications that have already been posted - instead, up-vote the existing answer.

    Read the article

  • Ubuntu 12.04 Freezes w/ Ethernet Unplugged + Wireless Drops (Acer Aspire 5516)

    - by Grand Master T
    Ubuntu 12.04-12.10 (32- and 64-bit) freezes or won't boot if the Ethernet cable is unplugged, and will not hold a wireless connection. Here is my scenario. Laptop: Acer Aspire 5516; wireless card: Broadcom BCM4312.
    Ubuntu 12.04 (32/64-bit) issues: Unity 3D won't load without the Ethernet cable plugged in, and if I let it load with Ethernet plugged in, it freezes once I disconnect the cable. Unity 2D does load without the Ethernet cable, but wireless cannot hold a connection: I can connect to a wireless network, but as soon as I try to use it (e.g. open a browser), it disconnects. I can reconnect by disabling wireless (unchecking Enable Wireless), re-enabling it, and reconnecting, but it disconnects again once I start using it.
    Ubuntu 12.10 issues: since 12.10 only gives me the option to load the 3D session (I assume), I experience the same thing as the first 12.04 issue.
    Attempted solutions: enabling networking/LAN in the BIOS; setting LAN first in the BIOS boot priority; removing the STA wireless driver (bcmwl-kernel-source) and installing the b43 low-power driver (firmware-b43-lpphy-installer); removing the default Network Manager and installing Wicd. So far I have had no luck fixing this issue. Does anyone have any further suggestions?
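
    For reference, a sketch of the STA-to-b43 swap described above; the package names are the ones from the question, and wl/b43 are the kernel modules those packages normally provide. Run it while the wired connection is still available, since the firmware package downloads files from the archive:

        sudo apt-get remove --purge bcmwl-kernel-source
        sudo apt-get install firmware-b43-lpphy-installer
        # swap the kernel modules without rebooting
        sudo modprobe -r wl
        sudo modprobe b43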

    Read the article

  • How to market R at your institute?

    - by ran2
    Okay, I admit there are lots of "R vs. something" threads. The strengths of R are obvious to most people here. Still, advertising R in an environment that has preferred various other kinds of software for quite some time is not easy. Moreover, even in the limited time I've been dealing with R, it has improved so dramatically that I would now mention things among its strengths that I would not have listed when I started my personal R-evolution. So, what I am trying to do here is collect the most recent and striking arguments that can be put in a nutshell and presented easily. What I have on my list so far is: the Springer useR series; ggplot2 and its documentation; open source; CRAN; Rapache; rcpp; rsocket. What can you add to this list? SO threads are also very welcome as answers. EDIT: so far, though indeed very helpful, most answers are arguments (pros) for why one would want to use R. Do you have some specific hints that I could include in some kind of overview presentation? EDIT 2: I wanted to add this link about R's future to the list...

    Read the article

  • Advanced merge directory tree with cp in Linux

    - by mtt
    I need to:
    1. Copy a whole tree of folders (with all files, including hidden ones) under /sourcefolder/, preserving ownership and permissions, to /destfolder/.
    2. If there is a conflict with a file (a file with the same name already exists in destfolder), rename the file in destfolder with a standard rule, such as adding an "old" prefix to the filename (readme.txt becomes oldreadme.txt), and then copy the conflicting file from source to destination.
    3. Conflicts between folders should be transparent: if the same directory exists in both sourcefolder and destfolder, preserve it and recursively copy its contents according to the rules above.
    I also need a .txt report that describes all files/folders added to destfolder and all files that were renamed. How can I accomplish this?
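
    There may be a ready-made tool for this, but here is a rough shell sketch of the rules above (the paths and the report location are placeholders; preserving ownership with cp -p needs root, and empty directories in the source are not recreated by this sketch):

        #!/bin/bash
        src=/sourcefolder
        dst=/destfolder
        report=/destfolder/merge-report.txt
        : > "$report"

        # walk every file under the source tree, hidden files included
        (cd "$src" && find . -type f -print0) | while IFS= read -r -d '' rel; do
            rel=${rel#./}
            mkdir -p "$dst/$(dirname "$rel")"
            if [ -e "$dst/$rel" ]; then
                dir=$(dirname "$rel"); base=$(basename "$rel")
                mv "$dst/$rel" "$dst/$dir/old$base"
                echo "renamed: $rel -> $dir/old$base" >> "$report"
            fi
            cp -p "$src/$rel" "$dst/$rel"   # -p keeps mode, ownership, timestamps
            echo "added: $rel" >> "$report"
        done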

    Read the article

  • Is it possible to have zsh+keychain+tmux not ask for keys?

    - by Wayne Werner
    I'm using tmux and zsh, and I've recently been learning about ssh-agent and keychain. The zsh manpage says it will source .zlogin only if the shell is, well, a login shell. Following advice I read, I stuck keychain --clear in my .zlogin, which worked perfectly: when I logged into the box I had to unlock my key. However, each time I create a new window in tmux, it clears my keys and makes me re-add them. This is a little annoying, but I can understand it if each new tmux window really is a login shell. I haven't been able to find much help outside of the manpages on this topic. So is each new tmux window a login shell, and is there any way to stop it from clearing my keys just because I opened a new tmux window?
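
    One possible direction, assuming the cause is that tmux starts its windows as login shells by default (its default-command option is empty, which means a login shell): pointing default-command at your shell makes new windows plain interactive shells, so .zlogin and its keychain --clear only run for the real login. A sketch for ~/.tmux.conf:

        # ~/.tmux.conf
        set-option -g default-command "${SHELL}"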

    Read the article

  • Do I need a license to create pdf files? [closed]

    - by Fire-Dragon-DoL
    I hope this is the correct place to ask this question. My mother is an accountant with a degree in economics. She works as a freelancer and needs some licenses for her job. The biggest issue is Adobe Acrobat Standard, which costs 400€, quite a lot. I want to understand whether she must buy it to create PDF files, or whether she can use some free (even for commercial use) programs that she has because of her job (the chamber of commerce provides some advantages to accountants). She is actually using PDFCreator, which as far as I can read is free for business use (and open source too): http://sourceforge.net/projects/pdfcreator/ Thanks for any suggestion.

    Read the article

  • proxy software that supports parallel transfer

    - by est
    I need to set up a really fast proxy on a remote server; here's the scenario:
    1. The server prefetches 3 KB of data, mostly HTTP resources.
    2. Instead of acting as a traditional HTTP or SOCKS proxy, the server sends the 3 KB to the client over a multithreaded transfer with 3 connections, 1 KB of data per connection.
    3. The client receives 1 KB x 3, recombines them into the original 3 KB, and exposes the result through a local HTTP proxy.
    4. The browser displays the original data via that local HTTP proxy.
    Latency is not important as long as the transfer rate is good. Does any software like this exist? Open source or free software would be preferred.

    Read the article

  • Iptables NAT logging

    - by Gerard
    I have a box set up as a router using iptables (masquerade), logging all network traffic. The problem: connections from LAN IPs to the WAN show up fine, i.e.
        SRC=192.168.32.10 - DST=60.242.67.190
    but for traffic coming from the WAN to the LAN it logs the WAN IP as the source with the router's IP as the destination, and then the router as the source with the LAN IP as the destination, i.e.
        SRC=60.242.67.190 - DST=192.168.32.199
        SRC=192.168.32.199 (router) - DST=192.168.32.10
    How do I configure it so that it logs the conversations correctly?
        SRC=192.168.32.10 - DST=60.242.67.190
        SRC=60.242.67.190 - DST=192.168.32.10
    Any help appreciated, cheers.
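
    A hedged guess at the cause: LOG rules that sit in the INPUT path (or after NAT) see the addresses post-translation. Logging forwarded packets in the filter table's FORWARD chain, which is traversed after DNAT in PREROUTING and before SNAT in POSTROUTING, shows the real LAN and WAN addresses in both directions. A minimal sketch:

        iptables -A FORWARD -j LOG --log-prefix "FWD: " --log-level info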

    Read the article

  • include all vim files in a folder

    - by queueoverflow
    For my .bashrc I have a lot of small snippet files in .config/bash, like 10-prompt.sh and so on. In my actual .bashrc, I just have the following:
        configdir="$HOME/.config/bash"
        for file in "$configdir"/*.sh
        do
            source "$file"
        done
    I'd like to do the same for my .vimrc, but I am not confident enough in VimL to write it. What would the snippet for my .vimrc look like that includes all the snippets in a given subfolder? Ideally, I'd like to make a .vim/rc/ folder where I can put my snippets.
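
    A sketch of the Vim script equivalent, assuming the snippets live in ~/.vim/rc/ and end in .vim (adjust the glob to taste):

        " in ~/.vimrc
        for s:file in split(glob('~/.vim/rc/*.vim'), '\n')
          execute 'source' fnameescape(s:file)
        endfor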

    Read the article

  • Advice for migrating email server

    - by Chris Adams
    I'm planning to migrate a Zimbra server with about 200 GB of data from a server hosted in an office into a datacentre, to increase uptime (we've had a couple of outages when our network here started flaking out, and we have people in other countries relying on this server too). However, I'm not sure how best to migrate the data into the datacentre without rendering the connection unusable during office hours, because there's far too much to send overnight over the two-megabit upstream connection we have here. I'm familiar with using tools like nice to stop a long-running process from degrading machine performance. Is there a simple way to throttle a connection during office hours, so the long-running transfer doesn't block the pipe, but then open it up outside office hours to make the most of the bandwidth? I'm aware the alternative here is to simply mail a hard drive to the datacentre, but I'd like to avoid that if I can. We're using CentOS Linux for our servers, in the office and the datacentre, so extra points for an open source Linux answer.
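
    One possible approach, sketched with made-up paths and hosts: pre-seed the data with rsync over ssh and cap the rate with --bwlimit (in KB/s) during the day, then re-run the same command uncapped at night. Because rsync only transfers differences, each re-run is cheap and the final cutover sync stays small.

        # daytime: capped at roughly 100 KB/s
        rsync -az --partial --bwlimit=100 /opt/zimbra/ user@datacentre:/opt/zimbra-seed/
        # overnight (e.g. from cron): same command without --bwlimit runs at full speed
        rsync -az --partial /opt/zimbra/ user@datacentre:/opt/zimbra-seed/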

    Read the article

  • I want to build a Debian apt repository for local LAN updates

    - by user73504
    I have downloaded all of Debian's DVD images, and I have set up the Apache httpd service. I combined the files from all the DVDs, but the .gpg file I need is missing and I can't create it; it looks like the repository's signature file. So when I set my /etc/apt/sources.list as follows:
        deb http://192.168.1.102/apt/debian squeeze main contrib
    it tells me that GPG verification failed. So I want to know how to create the gpg file, and whether I need to do anything else besides putting the DVD files in Apache's htdocs path.
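
    A sketch of one way to produce that signature, assuming the merged packages sit under dists/squeeze on the web server and you have (or generate) a GPG key to sign with; the clients then import the matching public key (the /var/www path and repo-key.asc name are placeholders):

        # on the server: build and sign the Release file
        cd /var/www/apt/debian/dists/squeeze
        apt-ftparchive release . > Release
        gpg --armor --detach-sign --output Release.gpg Release

        # on each client: trust the signing key, then update
        wget -O - http://192.168.1.102/repo-key.asc | sudo apt-key add -
        sudo apt-get update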

    Read the article

  • mosh-like port forwarding

    - by Marc Merlin
    This is on Linux, connecting to Linux servers: I love mosh, but it doesn't support port forwarding, and likely won't for a while, since it's been almost a year now and it hasn't happened yet. Port forwarding over ssh is great, but because my laptop moves between networks several times a day, my ssh sessions die, and so do the port forwards. I could script/hack something to detect the hung ssh and reconnect to get my port forwards back, but before I do that: is there another way to get long-lasting port forwards when your source IP changes several times a day (because you move between networks)? I'm thinking ssh over UDP would do the trick, but of course ssh runs over TCP.
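
    Not a true roaming solution like mosh, but a common workaround sketch: let autossh restart the tunnel whenever the connection dies, and set aggressive keepalives so dead connections are noticed quickly (the ports and hostname below are placeholders):

        autossh -M 0 -f -N \
            -o "ServerAliveInterval 15" -o "ServerAliveCountMax 3" \
            -o "ExitOnForwardFailure yes" \
            -L 8080:localhost:8080 user@server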

    Read the article

  • How can I troubleshoot SSD disconnection problems on Windows 7?

    - by 0xC0000022L
    I have a small SSD (Transcend StoreJet TS256GESD200K) which I am using on several computers. The drive is formatted with NTFS and recently I started noticing disconnects. Normally one probably wouldn't notice (you don't even get the notification sound in Windows when it disconnects), but since I use USBDLM and run a VM from that SSD, I get to see this first-hand. How could I best figure out whether the problem is the hardware (SSD, USB socket, USB cable or something in the PC) or software? In short: how can I locate the source of the disruption so that I can work on removing the problem? Side-note: SMART status for the SSD is clean.

    Read the article

  • 3D physics engine for accurate collision handling on desktop/laptop computers (non-console)

    - by Georges Oates Larsen
    What are your suggestions for a physics engine that satisfies the following criteria?
    - Capable of calculating collisions between multiple concave mesh-based colliders.
    - Handles many simultaneous collisions (for instance one mesh being wedged between two others, which may themselves be wedged between two meshes).
    - Does not allow collider pass-through, even at high speeds. For instance, if I am applying force to a programmatically hinged object that makes it spin, I do not want it to pass through another rigidbody that it collides with while spinning. I have this problem using PhysX.
    - As implied before, reacts well to hinged objects; preferably it has its own implementation of a hinge, but I am willing to program my own. The important part is that it has some sort of interface that guarantees accurate collision tracking even when dealing with these things.
    - Platform independent: runs on Mac as well as PC, and is not tied to specific graphics cards.
    I think that's the best way to explain what I am looking for. Basically, I need SUPER reliable collisions, something that can't be accomplished with a simple ray-casting approach that sends a ray from the last position of the object to the current position (as the object may be large and colliding with small objects via rotation). Bonus points for also including an OPEN SOURCE engine.

    Read the article

  • Tool that automatically keeps old versions of a file? Shadow Copy in Win7?

    - by Michael Stum
    When I'm working with a graphics app, I press Ctrl+S a lot to quicksave. Sometimes I just go too far and make a bad decision, occasionally to the point where Undo won't help either. I would love to retain old versions of a file. Normally, source control would be of use here, but that's a manual process (same as just making copies). I wonder if there is an automatic way to do it: every time the file changes, keep a backup. I believe that Shadow Copies can do this in Windows Server. When I check my Windows 7 (Ultimate) machine, I do see a "Previous Versions" tab, but that seems to be part of the backup function, which is once again manual. Is there a way to get that type of automatic versioning?

    Read the article

  • Installation causing broken packages

    - by AWE
    Here I come: I am so determined to use Ubuntu that I paid a professional to install it for me (dual-boot). When I got it, I got a lot of things from the Software Center. Skype did not have a download button, so I googled it and the Ubuntu help told me to do this:
        sudo add-apt-repository "deb http://archive.canonical.com/ $(lsb_release -sc) partner"
    and then this:
        sudo apt-get update && sudo apt-get install skype
    The terminal told me "that this is potentially harmful...", but I thought it was Ubuntu language for "are you sure?" Now items cannot be installed or removed until the package catalog is repaired, so I want to repair it, but the package operation fails. "sudo aptitude -f install" gives "command not found". The Synaptic package manager tells me that I have two broken packages, libc6 and libc6-dev, but doesn't help, it only makes life complicated. What the *#$%&!!! I don't want to be forced to become a computer scientist just to be able to use a free, open source OS. P.S. The sound stopped working.
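
    The "command not found" is because aptitude isn't installed by default on recent Ubuntu releases; a sketch of the equivalent repair steps with the tools that are installed:

        sudo apt-get update
        sudo apt-get install -f      # try to fix broken/unmet dependencies
        sudo dpkg --configure -a     # finish any half-configured packages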

    Read the article

  • How can I get my monitor's maximum resolution without the proprietary AMD graphic driver installed?

    - by Venki
    I am using Ubuntu 14.04 and have an AMD Radeon HD 5570 graphics card. The default open source (Redwood) driver isn't letting me choose my monitor's maximum resolution (1366x768); I am only offered 1024x768 and 800x600. If I run the command xrandr -s 1366x768, the output is: Size 1366x768 not found in available modes. So, just for the sake of getting the 1366x768 resolution, I am forced to install the proprietary driver that AMD provides on its site. But if I install it (which is itself quite a problem-prone process), I run into a lot of inconvenience. Sometimes after an OS update the driver crashes Unity, and then I have to uninstall the driver from a tty and google around for a solution. I also encounter occasional screen tearing, and in addition I can't see my login screen (see this question, which describes that particular problem). The main problem is that AMD does not update its driver as quickly as Ubuntu updates its OS, which is quite irritating. So: I want the maximum resolution (and performance) that my graphics card and monitor can give me without installing AMD's problematic proprietary driver. Is this possible? Suggestions please. Thanks in advance.
    PS: more system specs: Intel i3 2100 processor, AMD P8H61-M PLUS2 motherboard, AMD Radeon HD 5570 graphics card, Dell monitor. (BTW, thank you for reading through my elaborate description!)
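
    One thing that often works with the open source driver is to add the missing mode by hand with cvt and xrandr; a sketch, assuming the monitor's output is called DVI-0 (run plain xrandr first to see the real output name). Note that cvt may round 1366 up to 1368, which most such panels still accept, and that the change is lost on reboot unless it is put in a startup script or an xorg.conf snippet:

        # generate a modeline for 1366x768 at 60 Hz and register it
        modeline=$(cvt 1366 768 60 | sed -n 's/^Modeline //p' | tr -d '"')
        name=${modeline%% *}
        xrandr --newmode $modeline          # intentional word splitting: name + timings
        xrandr --addmode DVI-0 "$name"
        xrandr --output DVI-0 --mode "$name"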

    Read the article

  • Mercurial says "nothing changed", but it did. Sometimes my software is too clever.

    - by user12608033
    It seems I have found a "bug" in Mercurial. It takes a shortcut when checking for differences in tracked files: if a file's size and modification time are unchanged, it assumes its contents are unchanged.
        $ hg init .
        $ cp -p .sccs2hg/2005-06-05_00\:00\:00\,nicstat.c nicstat.c
        $ ls -ogE nicstat.c
        -rw-r--r-- 1 14722 2012-08-24 11:22:48.819451726 -0700 nicstat.c
        $ hg add nicstat.c
        $ hg commit -m "added nicstat.c"
        $ cp -p .sccs2hg/2005-07-02_00\:00\:00\,nicstat.c nicstat.c
        $ ls -ogE nicstat.c
        -rw-r--r-- 1 14722 2012-08-24 11:22:48.819451726 -0700 nicstat.c
        $ hg diff
        $ hg commit
        nothing changed
        $ touch nicstat.c
        $ hg diff
        diff -r b49cf59d431d nicstat.c
        --- a/nicstat.c Fri Aug 24 11:21:27 2012 -0700
        +++ b/nicstat.c Fri Aug 24 11:22:50 2012 -0700
        @@ -2,7 +2,7 @@
         * nicstat - print network traffic, Kb/s read and written. Solaris 8+.
         * "netstat -i" only gives a packet count, this program gives Kbytes.
         *
        - * 05-Jun-2005, ver 0.81 (check for new versions, http://www.brendangregg.com)
        + * 02-Jul-2005, ver 0.90 (check for new versions, http://www.brendangregg.com)
         * [...]
    Now, before you agree or disagree with me on whether this is a bug, I will also say that I believe it is a feature. Yes, I feel it is an acceptable shortcut, because in "real" situations an edit to a file will change the modification time by at least one second (the resolution that hg diff or hg commit is looking for). The benefit of the shortcut is greatly improved performance of operations like "hg diff" and "hg status", particularly where your repository contains a lot of files. Why did I have no change in modification time? Well, my source file was generated by a script that I wrote to convert SCCS change history to Mercurial commits. If my script can generate two revisions of a file within a second, and the files are the same size, then I run afoul of this shortcut. Solution: I will just change my script to apply the modification time from the SCCS history to the file prior to commit. A "touch -t <timestamp>" will do that easily.
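
    A sketch of that fix, with an illustrative timestamp (touch -t takes [[CC]YY]MMDDhhmm[.ss]): stamp the generated file with its SCCS delta time before committing, so consecutive conversions differ in mtime.

        touch -t 200507020000 nicstat.c   # 2005-07-02 00:00, matching the delta being converted
        hg commit -m "ver 0.90"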

    Read the article

  • Multiple foldmethods in vim

    - by bjarkef
    I use vim's folding quite a lot and usually have foldmethod set to syntax. Recently I discovered that it is possible to add custom folds, so that I can wrap whole blocks in /*{{{*/ and /*}}}*/, which is very useful for grouping large sections of a source file together. However, to use that feature I need to set foldmethod to marker, and then I lose the syntax folding. Is it possible to have two active foldmethods at the same time in vim? set foldmethod=syntax,marker does not work.

    Read the article

  • How to customize Windows 7 HAL library during installation (BSOD STOP: 0x000000A5)?

    - by koldovsky
    While trying to install Windows 7 x86 Ultimate on a Samsung M40 laptop (Pentium M 1.7 Dothan, 2 GB RAM, 100 GB HDD) I receive BSOD STOP: 0x000000A5: "The BIOS in this system is not fully ACPI compliant. Please contact your system vendor for an updated BIOS." The BIOS on the system is already updated to the latest version. If ACPI really is the source of the issue, then perhaps I could use another HAL library. In Windows XP it is possible to install the system with a generic HAL by pressing F7 when the installer asks you to supply drivers, but in Windows 7 I can't find such an option. Ironically, Vista installs and works fine, even though Windows 7 is supposed to be less demanding on hardware. The Windows 7 Advisor also reports nothing suspicious. Can anybody tell me how to customize the Windows 7 installer to use a generic HAL library (if that is possible, of course), or point me to another solution?

    Read the article

  • ArchBeat Link-o-Rama for 2012-09-26

    - by Bob Rhubart
    Oracle Introduces Free Version of Oracle Application Development Framework. Several community bloggers have already written about Oracle Application Development Framework (ADF) Essentials, the free version of Oracle ADF. Here's the official press release.
    ADF Essentials - Quick Technical Review | Andrejus Baranovskis. "This post is just a quick review for ADF Essentials on Glassfish," says Oracle ACE Director Andrejus Baranovskis. "I will do a proper performance test soon to compare ADF performance on..."
    5 ways to think like a cloud architect | ZDNet. "Is enterprise architecture ready for the cloud? Is the cloud ready for EA?" Joe McKendrick asks. "Cloud represents a different way of thinking. But we've been here before."
    Configuring trace file size and number in WebCenter Content 11g | Kyle Hatlestad. A quick tip from Oracle Fusion Middleware A-Team member Kyle Hatlestad.
    Thought for the Day. "Elegance is not a dispensable luxury but a factor that decides between success and failure." — Edsger W. Dijkstra (May 11, 1930 – August 6, 2002) Source: SoftwareQuotes.com

    Read the article
