Search Results

Search found 40479 results on 1620 pages for 'binary files'.

  • bootstrapping a SparkleShare project

    - by WoJ
    I just tried SparkleShare as a possible replacement for Dropbox/Insync. It looks quite promising, being based on open standards. I was wondering if someone has gone through the process of "bootstrapping" a SparkleShare project. I have the initial files I would like to keep synchronized on two clients and the server (as plain files), and I was wondering if there is a way to set a project up so that I would not need to download/upload all the files back and forth, since they are already available on all three systems. I guess this would involve some git kung-fu I am far from mastering. Thanks!
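    One possible seeding sequence, sketched under the assumption that the SparkleShare project is backed by a plain git repository reachable over SSH; every host and path below is a placeholder:

        # 1. on client A, turn the existing folder into the project repository
        cd ~/SparkleShare/MyProject
        git init
        git add . && git commit -m "initial import"
        # 2. create the bare repo on the server and push everything once
        ssh user@server 'git init --bare /srv/sparkleshare/MyProject.git'
        git remote add origin user@server:/srv/sparkleshare/MyProject.git
        git push origin master
        # 3. on client B, which already holds identical copies of the files,
        #    reuse A's .git metadata instead of cloning, so the file contents
        #    never cross the wire again
        rsync -a user@clientA:SparkleShare/MyProject/.git ~/SparkleShare/MyProject/
        cd ~/SparkleShare/MyProject && git status    # should report a clean tree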

  • Linux: Alternative to rsync? (i.e., scp with resume)

    - by Joernsn
    I've been using rsync to automatically send files from one box to another; it is great compared to scp, since it supports resuming. However, when resuming a very large file (10 GB), rsync has to read both files and compare them, which is very slow. I don't need fancy error handling, just "scp with resume", so here's my question: is there an alternative to rsync/scp that supports resuming without having to read both the source and destination files? I've read the manuals without finding anything I can use; please let me know if I've missed something. This is the rsync line I've been using:

        rsync -av --partial --progress --inplace SRC DST
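    For what it's worth, rsync itself has a mode that skips the comparison pass, sketched below; it assumes the partial file on the destination is a correct prefix of the source:

        # --append resumes by blindly appending to the shorter destination
        # file, without checksumming the data already transferred
        rsync -av --partial --progress --append SRC DST
        # rsync >= 3.0 also offers --append-verify, which checksums the
        # pre-existing data once the transfer completes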

  • Hyperlink to .doc file opens slowly

    - by mserioli
    I have two Excel files containing links to .doc and .pdf files. Both the Excel files and the linked files are on a network shared folder. The first Excel file is an .xls, the second an .xlsm. While opening a link to a .pdf file is very fast (the file opens in a few seconds), it takes a long time to open .doc files (about 40 seconds). I have searched the internet but found no solution so far. I have this problem with both Excel 2007 and 2010. Does anyone know how to solve this problem? Thanks a lot Marco

  • How can I view a PDF in Firefox when the server specifies the wrong content type?

    - by Sam
    I am using Mozilla Firefox with a PDF viewer plug-in. In the settings, the plug-in has been correctly associated with Adobe Reader files so that they are viewed in the browser. I would like to be able to view PDF files in Firefox rather than downloading them. This already works correctly when a web server indicates that a file has the Content-Type application/pdf. However, some web servers provide other Content-Types for PDFs, such as application/octet-stream. (See this example of a PDF served with a non-PDF Content-Type.) I have looked at Firefox's mimeTypes.rdf file, and it appears to only support mapping applications based on file types for non-Internet-based files. How can I have Firefox view all PDF documents in-browser rather than only the ones with the application/pdf Content-Type?

  • Setup CENTOS Centralized AUDIT and RSYSLOG server

    - by Warron.French
    Attempting to follow these links: Sending audit logs to SYSLOG server and http://wiki.rsyslog.com/index.php/Centralizing_the_audit_log, I have been unable to get centralized audit logging to work in my all-CentOS network environment. I have 6 workstations, dt1...dt6; the log files are not generated at all, and I cannot tell whether the messages are even being sent from the workstations dt1..dt6 over to the server (srv1). I have configured rsyslog.conf on the workstations as shown in the first link, and added the extra touches to write the log files into a separate directory per YEAR/MONTH/DAY (using the proper syntax) and into separate HOSTNAME-based_audit.log files. Note: rsyslog messaging does appear to work from the workstations over to the server, but the audit logging portion is not working. I am running CentOS 6.5 with the RPMs audit-2.2-4.el6_5.x86_64, audit-libs-2.2-4.el6_5.x86_64, and rsyslog-5.8.10-8.el6.x86_64. I have gotten zero responses from wiki.rsyslog.com and really need this to work. If needed I can send files from one of my workstations and the server to aid in the process. Thanks, Warron
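    For reference, the setup those links describe boils down to tailing the audit log with rsyslog's imfile module and forwarding it on a dedicated facility. A minimal sketch in rsyslog 5.x legacy syntax; the facility, port, and template path are assumptions, not the poster's exact configuration:

        # --- clients dt1..dt6, /etc/rsyslog.conf ---
        $ModLoad imfile
        $InputFileName /var/log/audit/audit.log
        $InputFileTag tag_audit_log:
        $InputFileStateFile audit_log
        $InputFileFacility local6
        $InputRunFileMonitor
        local6.* @@srv1:514    # forward over TCP

        # --- server srv1, /etc/rsyslog.conf ---
        $ModLoad imtcp
        $InputTCPServerRun 514
        $template AuditFile,"/var/log/rsyslog/%$YEAR%/%$MONTH%/%$DAY%/%HOSTNAME%-audit.log"
        local6.* ?AuditFile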

  • Do large folder sizes slow down IO performance?

    - by Aaron
    We have a Linux server process that writes a few thousand files to a directory, deletes the files, and then writes a few thousand more files to the same directory without deleting the directory. What I'm starting to see is that the process doing the writing is getting slower and slower. My question is this: the directory size of the folder has grown from 4096 to over 200000, as seen in this output of ls -l:

        root@ad57rs0b# ls -l 15000PN5AIA3I6_B
        total 232
        drwxr-xr-x 2 chef chef 233472 May 30 21:35 barcodes

    On ext3, can these large directory sizes slow down performance? Thanks. Aaron
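    Two things worth checking, sketched below with example device and directory names: whether hashed directory indexes are enabled, and compacting the bloated directory, since ext3 never shrinks a directory's on-disk size even after its entries are deleted:

        # is dir_index (hashed b-tree directory lookups) enabled?
        tune2fs -l /dev/sda1 | grep dir_index
        # compact the directory by recreating it (assumes no dotfiles and
        # that the writer process is stopped while this runs)
        mkdir barcodes.new
        mv barcodes/* barcodes.new/
        rmdir barcodes && mv barcodes.new barcodes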

  • How to get filename of job in cups?

    - by Grook
    I have printed a couple of files, and lpstat shows that they are completed, but the output is something like this:

        # lpstat -W completed -l
        Canon-1 root 1086464 Sat May 21 22:47:03 2011
        Alerts: job-canceled-by-user queued for Canon
        Canon-2 root 337920 Mon May 23 20:18:02 2011
        Alerts: job-canceled-by-user queued for Canon
        CanonWin-3 root 17408 Mon May 23 20:29:40 2011
        Alerts: job-completed-successfully queued for CanonWin

    How can I get the names of the files which have been printed? P.S. Is there a bash script which would give me the names of all files which have been printed?
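    One avenue to explore, assuming CUPS has been preserving job history: the per-job control files in the spool are IPP messages that usually embed the job title. A rough sketch; the spool path varies by distribution:

        # dump readable strings from each completed-job control file;
        # the job/document name normally appears near the top
        for f in /var/spool/cups/c*; do
            echo "== $f =="
            strings "$f" | head -n 20
        done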

  • Move some iTunes library items to different drive?

    - by Sören Kuklau
    My internal hard drive is somewhat small, and I only regularly listen to a fraction of my iTunes library anyway, so I'd like to keep large portions of it on an external drive for archival purposes. Since dealing with multiple iTunes libraries is somewhat painful, the solution I'm looking for is to move individual items of the library to a different location, without compromising the "Keep organized" and "Copy files" settings. I found an AppleScript that I assume is supposed to do this, Move Files To Folder…, but it instead copies them and doesn't update the library accordingly. I can do this manually by moving a file and then accessing it in iTunes, which prompts me for the new location. I just don't intend to do this one by one for thousands of files.

  • Is it possible to create an SFTP drop box?

    - by Jordan Reiter
    I have a Windows server with folders accessible via SFTP (the server is running OpenSSH); scp is blocked. I would like to copy files from a Linux server to the Windows server, and SFTP seems like a good option. Ideally I'd like something similar to an FTP drop box, so that the Linux box could just copy files directly over to the Windows box. I'm also open to any solution that would let me copy the files with the least amount of hassle. The language I'd be using on the Linux box is Python; not sure if that factors in or not.
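    A minimal push sketch from the Linux side, assuming key-based SSH authentication is already set up (host, user, and paths are placeholders):

        # batch-mode sftp reads commands from stdin, so this runs unattended
        # from a script or cron job
        echo 'put /srv/outgoing/report.csv /dropbox/' | sftp -b - user@winserver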

  • A desktop Wiki editor/viewer: is there anything out there?

    - by MrBertie
    I'm a big user of wikis, mainly DokuWiki; I really like the clarity and ease of use of simple text files. However, all good wikis seem to require a web server of some kind. Has anyone come across a good desktop wiki editor/viewer that works with plain-text files and lets me treat wiki text files just like any other document file type? (Note: not a desktop wiki running inside a local web server.) Before you rush to suggest (I hope!): I have done months of research on this and have tried Wixi, WikidPad, zulupad.... Any ideas, anyone?

  • rsync remote to local automatic backup

    - by Mark Molina
    Because all my work is stored on a remote server, I would like to back up my server automatically, monthly and weekly. My server is running CentOS 5.5, and while searching the web I found a tool named rsync. I made my first backup manually by using this command in a terminal:

        sudo rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP

    I'm then prompted for that user's password, and Bob's your uncle. This backs up the necessary files from my remote server to my local device, but does somebody know how I can automate this, like running this script automatically every Sunday? EDIT: I forgot to mention that I let DirectAdmin back up the files I need, and then copy those files from the remote server to a local server.
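    A cron-based sketch: it assumes key-based SSH login has been set up first, since cron cannot type a password; the schedule and log path are examples:

        # one-time setup: install an SSH key so rsync can log in unattended
        ssh-keygen -t rsa                  # accept the defaults
        ssh-copy-id USERNAME@IPADDRESS
        # then add this line with `crontab -e`: every Sunday at 03:00
        0 3 * * 0 rsync -chavzP --stats USERNAME@IPADDRESS:PATH_TO_BACKUP LOCAL_PATH_TO_BACKUP >> $HOME/rsync-backup.log 2>&1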

  • Puzzled about PHP file permissions and shared webhosting - what are some explanations?

    - by extrakun
    I have this issue with different web hosts, particularly with upload scripts that can only upload to a folder if it has 777 permissions (which is risky). On the test server (a different web host), 755 works well. On another web host, log files generated by PHP file functions sometimes cannot be written to, while other files are mysteriously unaffected (for instance, the log files for the entire week are 655 and work well, but today's log file doesn't work unless it is set to 777). I am more of an application developer than a server backend expert, so these behaviours puzzle me to no end. Why are they happening? What can be done?
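    The usual explanation is the user PHP runs as: under suEXEC/suPHP-style setups PHP runs as the file owner and 755 suffices, while under a shared web-server user only the world bits apply, hence the need for 777. A quick check, sketched with a placeholder path:

        # see which user the web server / PHP processes run as; the bracketed
        # first letter keeps grep from matching its own process
        ps aux | grep -E '[a]pache|[h]ttpd|[p]hp'
        # compare that against the owner of the upload folder
        ls -ld /path/to/uploads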

  • Directory comparison in Meld but ignoring changes that only involve file timestamp?

    - by creamcheese
    I'm using Meld to compare two directories of source code on Ubuntu. However, because all of the files in one of the directories have been touched, so that all of their timestamps were updated, Meld shows them as different even though the contents of the files have not changed. I'm only trying to find files that have different content, and I don't see an option to make Meld look only at changed contents. Any ideas on how to do this in Meld, or is there a better GUI directory comparison tool for Ubuntu?
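    In case a command-line fallback is acceptable, a content-only comparison can be sketched with diff or with checksum lists (directory names are placeholders):

        # recursive, content-based comparison; timestamps are ignored
        diff -rq dirA dirB
        # or, for large trees, compare sorted checksum lists
        (cd dirA && find . -type f -exec md5sum {} + | sort) > /tmp/a.sums
        (cd dirB && find . -type f -exec md5sum {} + | sort) > /tmp/b.sums
        diff /tmp/a.sums /tmp/b.sums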

  • Windows 7 Ult machine can see XP Pro but not vice versa

    - by Chadworthington
    My new Windows 7 Ultimate PC can attach to my work laptop and pull files, but my work laptop cannot find my Windows 7 machine on the network. Is there some sort of limitation going on here? Before I got the new machine, my old XP Pro PC could pull files from the XP Pro laptop, but not vice versa. The common thread seems to be that the work laptop cannot see other PCs, Windows 7 or not. Could it be because that PC is on a work domain? When I pull files from the work PC, I am prompted for domain credentials, which I provide.

  • How to force rsync to use destination directory as root

    - by thepurplepixel
    I have a simple script to one-way-sync files/folders within a directory:

        #!/bin/bash
        HOST='<hostname>'
        USER='<username>'
        DIR='/downloads/'
        SOURCE='/srv/torrents'
        rsync -e "ssh -l $USER" --remove-source-files -h -4 -r --stats --progress -i $SOURCE $HOST:$DIR
        find $SOURCE -type d -empty -prune -exec rmdir -p \{\} \;

    However, when this rsync operation runs, it creates a folder named torrents in /downloads on the destination machine. How can I force rsync to put all folders and files from /srv/torrents (remote) into /downloads/ (local) instead of creating /downloads/torrents as a separate directory?
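    This is standard rsync path semantics: a trailing slash on the source means "the contents of this directory" rather than the directory itself, so the usual fix is a one-character change:

        # the trailing slash on $SOURCE/ makes rsync copy the contents of
        # /srv/torrents instead of recreating the torrents directory itself
        rsync -e "ssh -l $USER" --remove-source-files -h -4 -r --stats --progress -i "$SOURCE/" "$HOST:$DIR"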

  • nginx: Disallow Access to a Folder, except some subfolders

    - by user68202
    How is it possible to deny access to a folder, but exempt some subfolders in it from the deny? I tried something like this (in this order):

        # this subfolder shouldn't be denied, and PHP scripts inside should be executable
        location ~ /data/public {
            allow all;
        }
        # this folder contains many subfolders that should be denied from public access
        location ~ /data {
            deny all;
            return 404;
        }
        ...

    This doesn't work correctly. Files inside the /data/public folder are accessible (everything else in /data is denied, as it should be), but PHP files inside /data/public are no longer executed (if I don't add these restrictions, the PHP files are executable). What is wrong? How can it be corrected? I think there's a better way to do it. It would be very nice if anyone could help me with this :).
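    One possible restructuring, sketched under the assumption of a standard PHP-FPM/FastCGI setup (the socket path is a placeholder): ^~ prefix locations short-circuit regex matching, with the longest matching prefix winning, and the public subtree carries its own nested PHP handler because the site-wide PHP location no longer applies inside it:

        location ^~ /data/public/ {
            location ~ \.php$ {
                include fastcgi_params;
                fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
                fastcgi_pass unix:/var/run/php-fpm.sock;   # placeholder socket
            }
        }
        location ^~ /data/ {
            deny all;
            return 404;
        }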

  • Bitlocker folder encryption

    - by Razor
    My situation: I know that BitLocker is meant to encrypt whole drives, but I have a hard drive that is already fully partitioned and containing data. I'd like to encrypt part of one partition, leaving the rest of the partition accessible. I would very much like to avoid programs like Norton PartitionMagic (which resize/split partitions), because every time I used them I had problems with the stored data. Question: is there any way, built-in alternative, or 3rd-party app that integrates with the Windows login to encrypt one subset of a partition? EDIT: I heard horror stories about EFS, which is why I don't want to use it. Some highlights from that article: "In fact I’ve only used EFS twice in the last ten years on my own computers and on both occasions I’ve lost files and documents. I therefore cannot recommend you ever encrypt your files with this Windows feature. Unfortunately, because of incompatibilities with some differing versions of EFS, files can end up scrambled and unrecoverable."

  • How can I organize movies, with IMDB import and fields for tags?

    - by fluxtendu
    How can I organize movies on Windows? Features wanted:

    - IMDB import
    - fields: director, genre, year, actors, ...
    - covers
    - management of DVDs / BDs / ripped AVIs, MKVs
    - moving/renaming files according to fields

    I want to achieve something like:

        M:\Movies\Director\year. title (genre).avi

    A syntax that lets me choose how I organize my files (like foobar2000 offers for MP3 files) according to IMDB data and personal tags would be great. Auto-fetching covers and listings of not-yet-ripped/yet-to-be-seen/wanted/loaned movies (txt, HTML or whatever) would be nice features too.

  • Add entire 300 GB filesystem to Git Annex repository?

    - by Ryan Lester
    By default, I get an error that the process has too many open files. If I lift the limit manually, I get an error that I'm out of memory. For whatever reason, it seems that Git Annex in its current state is not optimised for this sort of task (adding thousands of files to a repository at once). As a possible solution, my next thought was to do something like:

        cd /
        find . -type d | git annex add --$NONRECURSIVELY
        find . -type f | git annex add
        # need to add the parent directories of each file first, or adding files fails

    The problem with this solution is that, judging from the documentation, there doesn't seem to be a way to non-recursively add a directory in Git Annex. Is there something I'm missing, or a workaround for this? If my proposed solution is a dead end, are there other ways that people have solved this problem?
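    For the resource limits specifically, one workaround to sketch (independent of whether a non-recursive add exists): hand the files to git annex add in bounded batches, so no single invocation holds thousands of files open at once:

        # -print0/-0 survives spaces in names; -n 1000 caps each batch
        cd /
        find . -type f -print0 | xargs -0 -n 1000 git annex add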

  • Is there a way to tell if a file is done copying?

    - by Mike Cooper
    The scenario is this: Machine A has files I want to copy to Machine C. Machine A can't access C directly, but can access Machine B, which can access Machine C. I am using scp to copy from Machine A to B, and then from B to C. Machine B has limited storage space, so as files come in, I need to copy them to C and delete them from B. The second copy is much faster, so bandwidth is not a problem. I could do this by hand, but I am lazy. What I would like is to run a script on B or C that copies each file over to C as it finishes arriving. The scp job is running from A. So what I need is a way to ask (preferably from a bash script) whether file X.avi is "done" copying. Each of these files is a different size, and I can't really predict the size or time of completion. Edit: by the way, the transfer times are about 1 hour from A to B and about 10 minutes from B to C, if the time scale matters at all.
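    One common heuristic, sketched below with a placeholder path: a file that the receiving scp/sshd process still holds open is not finished, and lsof can serve as that test:

        # lsof exits 0 if any process still has the file open
        if ! lsof -- /incoming/X.avi >/dev/null 2>&1; then
            echo "X.avi is complete; safe to forward and delete"
        fi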

  • Create a wireless Network in Ubuntu 10.04?

    - by Vinaychalluru
    Hello. Whenever I need to copy some files and can't find a pen drive (I don't have Bluetooth enabled on my laptop), I create a new ad hoc (wireless) network in Windows, share the necessary files on that network, and exchange them with the other system. Now I want to do the same with Ubuntu, but I couldn't. I can create a new network, but I am not able to share or send files, and I don't even know how to access the other system connected to the network. How can I do this? Thanks
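    Once the ad hoc network is up and both machines have addresses, one quick way to share a folder is sketched below; Ubuntu 10.04 ships with Python 2, so nothing extra is assumed beyond the folder path:

        # serve the folder over HTTP on port 8000
        cd ~/shared && python -m SimpleHTTPServer 8000
        # on the other machine, browse to http://<ad-hoc-ip>:8000/
        # or fetch everything: wget -r http://<ad-hoc-ip>:8000/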

  • MS Windows issue - "Filename or extension is too long"

    - by Daniel
    I run Microsoft Windows on a few of my machines. I don't know if many people know about this issue in the OS, but you can't have very long filenames; from what I know, Linux allows longer names, and I have never run into this issue on my Linux machines. Anyway, I run into issues whenever I copy folders and files to backup drives. I back up my data manually, finding and changing the names of files that are too long, which is very, very tedious. Is there a software tool that shortens folder or file names that are found to be too long on Windows? I have drive-image duplication software which does the job, but in a way that I don't like; plus, moving files can become a hassle if the names are too long to copy.
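    To at least locate the offenders before copying, a sketch that assumes a Unix-style shell on Windows (Cygwin or Git Bash) and an example drive path; 260 characters is the classic MAX_PATH limit:

        # list every path longer than 260 characters under the data folder
        find /cygdrive/d/data | awk 'length($0) > 260'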

  • Windows folder encryption

    - by Razor
    My situation: I know that BitLocker is meant to encrypt whole drives, but I have a hard drive that is already fully partitioned and containing data. I'd like to encrypt part of one partition, leaving the rest of the partition accessible. I would very much like to avoid programs like Norton PartitionMagic (which resize/split partitions), because every time I used them I had problems with the stored data. Question: is there any way, built-in alternative, or 3rd-party app that integrates with the Windows login to encrypt one subset of a partition? EDIT: I heard horror stories about EFS, which is why I don't want to use it, unless there have been improvements in reliability with Windows 8. Some highlights from that article: "In fact I’ve only used EFS twice in the last ten years on my own computers and on both occasions I’ve lost files and documents. I therefore cannot recommend you ever encrypt your files with this Windows feature. Unfortunately, because of incompatibilities with some differing versions of EFS, files can end up scrambled and unrecoverable."

  • tar incremental backup is backing everything up, every time

    - by Cyclic
    I made an incremental backup about 10 months ago (on Jan 27, 2013), creating a .snar metadata file. Now, when I try to make an incremental backup using

        tar --create --file=dropbox_incremental_1.tar --listed-incremental=dropbox_0.snar Dropbox

    the command just re-backs up everything. I'm not an expert on Unix timestamps, but I noticed that virtually all of my directory timestamps are far more recent than the last time their contents changed. My actual files look like this:

        Access: 2013-03-12 19:04:51.000000000 -0500
        Modify: 2012-09-30 15:10:47.000000000 -0500
        Change: 2013-03-12 19:04:51.306209672 -0500

    The Modify timestamp seems correct, but the files were definitely not changed (at least not by anything that I know of) at the time they claim. These files still go into the incremental archive. What's happening here? Is there a way to tell tar to look at the Modify timestamp? Isn't that what it's supposed to be doing?
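    A note that likely explains the behaviour: GNU tar's listed-incremental mode also considers the ctime (the "Change" line above), which is bumped by chmod/chown/renames even when content is untouched, and here the Change time is indeed recent. A way to see which files would be swept in, sketched with a placeholder path:

        # Modify = file content changed; Change = inode metadata changed
        stat Dropbox/some/file
        # files whose ctime is newer than the old snapshot date (GNU find)
        find Dropbox -type f -newerct "2013-01-27" | head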

  • vsftpd, virtual users and permissions. Avoid using chmod 777?

    - by Jakobud
    I am running vsftpd with virtual users (managed through a MySQL database). Each user's home/default directory is owned by vsftpd:vsftpd. I need to give a user read/write permissions to some website files, owned by apache:apache, so they can make some changes. I did a bind mount of the web directory to a directory in the FTP user's home/default directory. When logging in, the user is not able to write to the web folder unless I set the files to 777. Is it possible to set this up without making the directory and its files 777? The web directory needs to stay apache:apache in order for Apache to work with it.
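    A group-based sketch, assuming the FTP sessions access files as the vsftpd system user (the web path is a placeholder): give apache's group write access and add vsftpd to that group, so neither the ownership nor 777 is needed:

        # let the ftp user act as a member of apache's group
        usermod -aG apache vsftpd
        # group read/write everywhere, execute/search only where already set
        chmod -R g+rwX /var/www/site
        # setgid on directories so files created later inherit the apache group
        find /var/www/site -type d -exec chmod g+s {} +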
