Search Results

Search found 49518 results on 1981 pages for 'configuration files'.


  • Is there an easy way to mass-transfer all files between two computers?

    - by Raven Dreamer
    I'm currently visiting my parents for the Christmas holidays, and as I'm sure is the case for many in the post-tech generation, my arrival brings with it the assumption of personal tech support. I've been tasked with setting up my folks' new computer, and they want to make sure all their files, yes, all their files, get transferred over to the new device. Short of manually dragging each folder in the C:\ directory onto an external hard drive and then out onto the recipient computer, is there an easier / faster way to do this?
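
    One possibility, assuming both machines run Windows and the external drive mounts as E: (the drive letter and paths below are placeholders): robocopy ships with Windows 7 and can mirror an entire tree in one command, picking up where it left off if interrupted.

        rem old PC -> external drive, then (on the new PC) drive -> new PC
        robocopy C:\Users E:\transfer\Users /E /R:1 /W:1
        robocopy E:\transfer\Users C:\Users /E /R:1 /W:1

    /E copies subdirectories including empty ones; /R:1 and /W:1 keep it from stalling for minutes on locked or unreadable files.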

    Read the article

  • Windows Server 2008 R2 DFS Replication - Which files are replicating?

    - by caleban
    Is it possible to see which specific files are replicating in real time, using a GUI, a command-line tool, or a log somewhere? I didn't see this in the DFS health report, in the DFS event viewer, or in the DFS log. The log is pretty cryptic, though, so it may be in there and I'm simply unable to see it. I searched the DFS log for paths and files I know should be replicating, and they're not in there.
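
    A sketch of one way to peek at this, rather than a definitive answer: the dfsrdiag tool installed with the DFS Replication role can list the files currently queued (backlogged) between two members; the group, folder, and server names below are placeholders.

        dfsrdiag backlog /rgname:MyReplicationGroup /rfname:MyReplicatedFolder /smem:SERVER01 /rmem:SERVER02

    This shows what is waiting to replicate rather than transfers the instant they happen, but run repeatedly it gives a near-real-time view.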

    Read the article

  • How to copy with cp to include hidden files and hidden directories and their contents?

    - by eleven81
    How can I make cp -r copy absolutely all of the files and directories in a directory? Requirements: include hidden files and hidden directories; be one single command with a flag to include the above; not rely on pattern matching at all. My ugly, but working, hack is:

        cp -r /etc/skel/* /home/user
        cp -r /etc/skel/.[^.]* /home/user

    How can I do this all in one command, without the pattern matching? What flag do I need to use?
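
    For what it's worth, one single command that needs no glob patterns at all: copying the directory's contents through its "." entry picks up hidden files and directories too, and -a implies -r while also preserving ownership, permissions, and timestamps.

        cp -a /etc/skel/. /home/user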

    Read the article

  • What is the best way to remove duplicate files on web hosting's FTP server?

    - by Eric Harrison
    For some reason (it happened before I started working on this project), my client's website has two duplicates of every single file, effectively tripling the size of the site. The files look much like this:

        wp-comments-post.php | 3,982 bytes
        wp-comments-post (john smith's conflicted copy 2012-01-12).php | 3,982 bytes
        wp-comments-post (JohnSmith's conflicted copy 2012-01-14).php | 3,982 bytes

    The hosting that the website is on has no access to bash or SSH. In your opinion, what would be the easiest way to delete these duplicate files in the least time?
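
    One hedged sketch, assuming plain FTP works and lftp is available on a machine you control (the credentials and host below are placeholders): list every path recursively, keep only the conflicted copies, review the list by hand, then feed quoted rm commands back to lftp.

        # list everything on the server, keep only the Dropbox-style conflicted copies
        lftp -u USER,PASS ftp.example.com -e 'find .; quit' | grep 'conflicted copy' > conflicted.txt
        # inspect conflicted.txt first, then delete the listed files
        sed 's/.*/rm "&"/' conflicted.txt | lftp -u USER,PASS ftp.example.com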

    Read the article

  • Bash Shell Scripting - How to iterate through directories, and copy and rename files?

    - by Cypher
    I have a directory setup as follows:

        /hosted/partner1/logo.png
        /hosted/partner2/logo.png
        /hosted/partner3/logo.png
        /hosted/partner4/logo.png
        /hosted/partner5/logo.png
        ...etc.

    I want to write a script that can COPY those files to a different location, with a different file name, like this:

        /partners/partner1.png
        /partners/partner2.png
        /partners/partner3.png
        ...etc.

    Any ideas? I'm not so great with shell scripting and there are a lot of files that I need to migrate to a single directory...
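
    A minimal sketch under the layout shown above (the paths are taken straight from the question):

        #!/usr/bin/env bash
        # copy each partner's logo into /partners, renamed after the partner directory
        mkdir -p /partners
        for dir in /hosted/*/; do
            name=$(basename "$dir")    # e.g. "partner1"
            cp "$dir/logo.png" "/partners/$name.png"
        done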

    Read the article

  • Which files located under C:\ are necessary for Win7 to boot?

    - by k0pernikus
    I had my greatest moment of incredible stupidity and deleted all the hidden files on the Windows partition, most commonly known as C:\, while running GNU/Linux. All the directories are intact. I instantly unmounted it and ran ntfsundelete, but of the thousands of entries I wonder which ones I have to recover. Hence my question: which files located directly under C:\ are necessary for Windows 7 to boot?

    Read the article

  • Is it possible to copy a set of files, but automatically skip if file already exists?

    - by awe
    I know that the copy command has an option (/Y) to automatically replace a file if it already exists, but I want to know if there is a way to copy the files only if they do not already exist. I do not know the actual file names in the batch code, as I copy from the source using wildcards in the copy command:

        copy *.zip c:\destination

    The reason I want this instead of automatic overwrite is that the files are large, and skipping existing ones would save a lot of execution time.
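
    One hedged approach, assuming robocopy is acceptable (it ships with Windows Vista/Server 2008 and later): the /XC, /XN, and /XO switches exclude changed, newer, and older files, so together they skip anything that already exists at the destination, whatever its timestamp.

        robocopy . c:\destination *.zip /XC /XN /XO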

    Read the article

  • Have a process which runs nightly to automatically zip old files?

    - by esac
    I have a file share, and I want a process that enumerates files on that share and automatically creates a 7z self-extracting exe of files over 1 month old. On a different share, I want to create a 7z self-extracting exe of directories that are over 1 month old. Any idea if there is a program that can do this? I already have the

        7z a -t7z -mx9 -sfx filename.exe filename.txt

    portion of it; I just need more of the auto-management portion.
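
    A sketch of the auto-management half, not a turnkey tool: forfiles (built into Server 2008 and later) can select files older than a cutoff and run a command per file. The share path, output directory, and 30-day cutoff below are placeholders, and Task Scheduler would run the batch nightly; the directory case would need a separate pass filtering on forfiles' @isdir variable.

        rem pack every file older than 30 days into a self-extracting archive
        forfiles /P \\server\share /S /D -30 /C "cmd /c 7z a -t7z -mx9 -sfx C:\archive\@fname.exe @path"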

    Read the article

  • Why does Red5 puke over h.264 mpeg files?

    - by KarateCowboy
    I have a new installation of Red5 on Ubuntu Linux. We are using JW Player on the client side. We can pause and skip ahead when streaming FLV files. However, with MP4 files, if you pause and then press play again, playback starts over from the beginning. If I try to skip ahead, it also just plays from the beginning.

    Read the article

  • How should we serve files in a small bioinformatics cluster?

    - by cespinoza
    We have a small cluster of six Ubuntu servers on which we run bioinformatics analyses. Each analysis takes about 24 hours to complete, each Core i7 server can handle two at a time, and each takes as input about 5 GB of data and outputs about 10-25 GB of data. We run dozens of these a week. The software is a hodgepodge of custom Perl scripts and third-party sequence alignment software written in C/C++.

    Currently, files are served from two of the compute nodes (yes, we're using compute nodes as file servers). Each node has five 1 TB SATA drives mounted separately (no RAID) and pooled via GlusterFS 2.0.1, and each has three bonded Intel PCI gigabit Ethernet cards attached to a D-Link DGS-1224T switch ($300, 24-port, consumer-level). We are not currently using jumbo frames (not sure why, actually). The two file-serving compute nodes are mirrored via GlusterFS, and each of the four other nodes mounts the files via GlusterFS. The files are all large (4 GB+) and are stored as bare files (no database, etc.), if that matters.

    As you can imagine, this is a bit of a mess that grew organically without forethought, and we want to improve it now that we're running out of space. Our analyses are I/O-intensive, and it is a bottleneck: we're only getting 140 MB/s between the two file servers, and maybe 50 MB/s from the clients (which only have single NICs). We have a flexible budget which I can probably get up to $5k or so.

    How should we spend our budget? We need at least 10 TB of storage fast enough to serve all nodes. How fast/big does the CPU/memory of such a file server have to be? Should we use NFS, ATA over Ethernet, iSCSI, GlusterFS, or something else? Should we buy two or more servers and create some sort of storage cluster, or is one server enough for such a small number of nodes? Should we invest in faster NICs (say, PCI Express cards with multiple connectors)? The switch? Should we use RAID? If so, hardware or software, and which level (5, 6, 10, etc.)? Any ideas appreciated. We're biologists, not IT gurus.
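
    Not an answer to the hardware questions, but for scale: if the eventual choice is a single dedicated NFS server, the server/client side is only a few lines. A minimal sketch, assuming a hypothetical host named fileserver exporting /data to the cluster's 10.0.0.0/24 subnet (all three names are assumptions):

        # /etc/exports on the file server
        /data 10.0.0.0/24(rw,async,no_subtree_check)

        # apply the export table on the server
        sudo exportfs -ra

        # on each of the six compute nodes
        sudo mount -t nfs fileserver:/data /data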

    Read the article
