Search Results

Search found 37883 results on 1516 pages for 'sparse files'.


  • What config files need to be transferred when migrating Apache vhosts from an old SUSE server to a new SUSE server?

    - by jarus
    I have an old server running SUSE that hosts numerous websites under the same IP. I am trying to migrate the websites and all of the content of the old SUSE server to a new server running openSUSE 12.1. So far I have transferred "/srv/www/vhosts", "/etc/apache2/vhosts.d", "/etc/apache2/httpd.conf", "/etc/apache2/listen.conf" and "/etc/apache2/default-server.conf", and I have transferred all the database files as well. I am trying to replace the old server with the new server; I tried changing the IP address to the old server's IP address, but it is not working. What files do I need to transfer, and what do I need to do to get the new server hosting the websites in place of the old one? Any help will be greatly appreciated.
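
    The exact list depends on the setup, but as a starting point, here is a minimal Python sketch that bundles the Apache configuration paths already named above, plus a few common extra candidates (the conf.d and ssl.* entries are assumptions and may not exist on this particular server), into one tarball for transfer:

        import os
        import tarfile

        # Paths already named in the question, plus assumed extras worth checking.
        candidate_paths = [
            "/srv/www/vhosts",
            "/etc/apache2/vhosts.d",
            "/etc/apache2/httpd.conf",
            "/etc/apache2/listen.conf",
            "/etc/apache2/default-server.conf",
            "/etc/apache2/conf.d",    # assumption: additional module/vhost config
            "/etc/apache2/ssl.crt",   # assumption: SSL certificates, if any
            "/etc/apache2/ssl.key",
        ]

        with tarfile.open("apache-migration.tar.gz", "w:gz") as tar:
            for path in candidate_paths:
                if os.path.exists(path):
                    tar.add(path)      # keeps the full path inside the archive
                else:
                    print("skipping missing path:", path)

    On the new server the archive can be unpacked at / and Apache reloaded; the listen address in listen.conf and the DNS records usually also need updating to match the new machine.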

    Read the article

  • 7zip: Add files to new folder in archive via command line?

    - by cschol
    I am using 7zip to compress a bunch of files. The files are in a directory structure like this:

        MyDir\File1
        MyDir\File2
        MyDir\File3
        MyDir\MoreFiles\File4
        MyDir\MoreFiles\File5

    I want to create a 7z file with the following structure via the command line:

        ZippedDir\File1
        ZippedDir\File2
        ZippedDir\File3
        ZippedDir\MoreFiles\File4
        ZippedDir\MoreFiles\File5

    Basically, I want to zip the content of MyDir\ into a new root folder called ZippedDir\. I know I could copy the content into a directory called ZippedDir\ and then zip that new directory. However, I was wondering if there is a way to avoid this extra copy step and zip the content directly, if possible, via the command line.
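
    7-Zip itself has no obvious switch for renaming the archive root, so one way to avoid the intermediate copy is to build the archive programmatically. A minimal Python sketch using the standard-library zipfile module (it produces a .zip rather than a .7z, so this assumes the container format is negotiable):

        import os
        import zipfile

        src_root = r"MyDir"       # existing directory on disk
        new_root = "ZippedDir"    # name the entries should carry inside the archive

        with zipfile.ZipFile("ZippedDir.zip", "w", zipfile.ZIP_DEFLATED) as zf:
            for dirpath, _dirnames, filenames in os.walk(src_root):
                for name in filenames:
                    full = os.path.join(dirpath, name)
                    # Rewrite "MyDir\..." entries to "ZippedDir\..." without copying anything.
                    rel = os.path.relpath(full, src_root)
                    zf.write(full, arcname=os.path.join(new_root, rel))

    The same renaming trick works with the tarfile module if a .tar.gz would be acceptable instead.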

    Read the article

  • How to securely delete files stored on a SSD?

    - by Chris Neuroth
    From a (very long, but definitely worth reading) article on SSDs: When you delete a file in your OS, there is no reaction from either a hard drive or SSD. It isn’t until you overwrite the sector (on a hard drive) or page (on a SSD) that you actually lose the data. File recovery programs use this property to their advantage and that’s how they help you recover deleted files. The key distinction between HDDs and SSDs, however, is what happens when you overwrite a file. While a HDD can simply write the new data to the same sector, a SSD will allocate a new (or previously used) page for the overwritten data. The page that contains the now invalid data will simply be marked as invalid and at some point it’ll get erased. So, what would be the best way to securely erase files stored on an SSD? Overwriting with random data as we are used to from hard disks (e.g. using the "shred" utility) won't work unless you overwrite the WHOLE drive...

    Read the article

  • Excel: exporting/importing different columns to different CSV files

    - by Sisyphus
    Is there a way to batch export different columns to different CSV files in Excel on OS X? I'm thinking something along the lines of Automator, AppleScript or bash. I've had a play around with Automator and so far no luck. The best I have accomplished is exporting the whole sheet and then using sed to strip out what I don't need, which is terribly inefficient. Also, is there a method to batch import multiple CSV files into columns? Thanks in advance, and sorry I didn't tag Excel correctly; it wouldn't allow me to create the excel:mac tag.
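
    Since the goal is a scriptable column-by-column export, one route that skips Automator entirely is a small Python script. This is only a sketch: it assumes the third-party openpyxl package is installed and that the workbook is an .xlsx file named workbook.xlsx (both assumptions, not details from the question):

        import csv
        import openpyxl  # third-party: pip install openpyxl

        wb = openpyxl.load_workbook("workbook.xlsx", data_only=True)
        ws = wb.active

        # Write each column of the active sheet to its own CSV file,
        # named after the header in the first row.
        for column in ws.iter_cols(values_only=True):
            header, *values = column
            if header is None:
                continue
            with open(f"{header}.csv", "w", newline="") as fh:
                writer = csv.writer(fh)
                for value in values:
                    writer.writerow([value])

    Importing would be the reverse: read each CSV with the csv module and write its values into a fresh column with ws.cell(row=..., column=..., value=...).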

    Read the article

  • When running a shell script, how can you protect it from overwriting or truncating files?

    - by Joseph Garvin
    If, while an application is running, one of the shared libraries it uses is written to or truncated, the application will crash. Moving the file or removing it wholesale with 'rm' will not cause a crash, because the OS (Solaris in this case, but I assume this is true on Linux and other *nix as well) is smart enough not to delete the inode associated with the file while any process has it open. I have a shell script that performs installation of shared libraries. Sometimes it may be used to reinstall versions of shared libraries that were already installed, without an uninstall first. Because applications may be using the already installed shared libraries, it's important that the script is smart enough to rm the files or move them out of the way (e.g. to a 'deleted' folder that cron could empty at a time when we know no applications will be running) before installing the new ones, so that they're not overwritten or truncated. Unfortunately, an application recently crashed just after an install. Coincidence? It's difficult to tell. The real solution here is to switch over to a more robust installation method than an old gigantic shell script, but it would be nice to have some extra protection until the switch is made. Is there any way to wrap a shell script to protect it from overwriting or truncating files (and ideally failing loudly), while still allowing them to be moved or rm'd? Standard UNIX file permissions won't do the trick because you can't distinguish moving/removing from overwriting/truncating. Aliases could work, but I'm not sure what the entirety of commands to be aliased is. I imagine something like truss/strace, except that before each action it checks against a filter whether to actually do it. I don't need a perfect solution that would work even against an intentionally malicious script. Ideas I have so far:
    - Alias cp to GNU cp (not the default since I'm on Solaris) and use the --remove-destination option.
    - Alias install to GNU install and use the --backup option. It might be smart enough to move the existing file to the backup file name rather than making a copy, thus preserving the inode.
    - "set noclobber" in ~/.bashrc so that I/O redirection won't overwrite files.
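
    One way to get the "move aside, never overwrite in place" behaviour without rewriting the whole installer is to funnel file installation through a small helper. A minimal Python sketch of that idea; the names install_lib and DELETED_DIR are made up for illustration, and the parking directory is assumed to live on the same filesystem as the installed libraries:

        import os
        import shutil
        import time

        DELETED_DIR = "/opt/myapp/deleted"   # hypothetical parking directory, emptied later by cron
                                             # note: must be on the same filesystem as dest,
                                             # since os.rename cannot cross filesystems

        def install_lib(src, dest):
            """Install src at dest without ever truncating an inode a process may have open."""
            if os.path.exists(dest):
                os.makedirs(DELETED_DIR, exist_ok=True)
                parked = os.path.join(DELETED_DIR, f"{os.path.basename(dest)}.{int(time.time())}")
                # rename() keeps the old inode intact, so running processes keep their mapping
                os.rename(dest, parked)
            # copy to a temporary name, then rename into place so the new file appears atomically
            tmp = dest + ".new"
            shutil.copy2(src, tmp)
            os.rename(tmp, dest)

    Anything the existing script does with cp or shell redirection would have to go through a helper like this (or through the GNU cp --remove-destination idea above) for the guarantee to hold.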

    Read the article

  • Any way to know what files were in a broken ZFS pool?

    - by Erik Tjernlund
    I have a large ZFS pool of 4 combined drives. Now, the filesystem can not be mounted:

          pool: tank
         state: UNAVAIL
        status: One or more devices could not be opened. There are insufficient
                replicas for the pool to continue functioning.
        action: Attach the missing device and online it using 'zpool online'.
           see: http://www.sun.com/msg/ZFS-8000-3C
          scan: none requested
        config:

            NAME        STATE     READ WRITE CKSUM
            tank        UNAVAIL      0     0     0  insufficient replicas
              c10t0d0   ONLINE       0     0     0
              c8t0d0    UNAVAIL      0     0     0  cannot open
              c8t1d0    ONLINE       0     0     0
              c10t1d0   ONLINE       0     0     0

    Probably a broken drive (c8t0d0). I'm not overly concerned by the loss of the data, but I'd love to know exactly which files were in that pool. Is there any way to get a listing of what files were there?

    Read the article

  • How to Merge Data From Multiple Excel Files into a Single Excel File or Access Database?

    - by lalabeans
    I have a few dozen Excel files which are all of the same format (i.e. 4 worksheets per Excel file). I need to combine all the files into 1 master file which must have just 2 of the 4 worksheets. The corresponding worksheets from each Excel file are named exactly the same, as are the column headers. While each file is structured the same, the information within sheets 1 and 2 (for example) is different. So it can’t be combined into one file with everything in one sheet! I've never used VBA before and I'm wondering where I might start this task!
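
    VBA is one route, but the same merge can be done outside Excel. A minimal Python sketch, assuming the third-party openpyxl package, that the source files are .xlsx, and that the two worksheets to keep are named "Sheet1" and "Sheet2" (placeholder names, since the real sheet names aren't given):

        import glob
        import openpyxl  # third-party: pip install openpyxl

        SHEETS_TO_KEEP = ["Sheet1", "Sheet2"]   # placeholder names for the two wanted worksheets

        master = openpyxl.Workbook()
        master.remove(master.active)            # drop the default empty sheet
        targets = {name: master.create_sheet(name) for name in SHEETS_TO_KEEP}

        for i, path in enumerate(sorted(glob.glob("*.xlsx"))):
            wb = openpyxl.load_workbook(path, data_only=True)
            for name in SHEETS_TO_KEEP:
                rows = wb[name].iter_rows(values_only=True)
                header = next(rows)
                if i == 0:
                    targets[name].append(header)   # copy the shared header row once
                for row in rows:
                    targets[name].append(row)

        master.save("master.xlsx")

    If the sources are in the older .xls format, they would need a quick Save As to .xlsx first, since openpyxl only reads .xlsx/.xlsm files.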

    Read the article

  • How can I automatically move files based on their name?

    - by Pasha
    I have 13 folders containing scanned photographs. Some photographs have been renamed to the date on which they were taken, resulting in YYYY.MM.DD.tif name. It could potentially be YYYY.MM.DD (###).tif where ### is just a number. Others are just named IMG_###.tif I would like to move the files with the YYYY.MM.DD name to a YYYY\MM\DD folder structure. While the files are being moved, I would also like to append the original folder name to the end of the file name. So, a file 01\2012.06.26 (1).tif should end up 2012\06\26\2012.06.26 (1) - 01.tif Is there a Windows tool that can help me with this? Or do I need to resort to writing a custom app?
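
    There are Windows renaming tools that can do part of this, but it is also only a few lines of script. A minimal Python sketch, assuming the 13 numbered folders all sit under one root directory (ROOT and DEST below are placeholder paths):

        import os
        import re
        import shutil

        ROOT = r"C:\scans"    # placeholder: parent of the 13 numbered folders
        DEST = r"C:\sorted"   # placeholder: destination root for the YYYY\MM\DD tree

        # Matches "YYYY.MM.DD.tif" and "YYYY.MM.DD (123).tif"; IMG_###.tif files are ignored.
        pattern = re.compile(r"^(\d{4})\.(\d{2})\.(\d{2})(.*)\.tif$", re.IGNORECASE)

        for folder in os.listdir(ROOT):
            src_dir = os.path.join(ROOT, folder)
            if not os.path.isdir(src_dir):
                continue
            for name in os.listdir(src_dir):
                m = pattern.match(name)
                if not m:
                    continue
                year, month, day, suffix = m.groups()
                target_dir = os.path.join(DEST, year, month, day)
                os.makedirs(target_dir, exist_ok=True)
                # e.g. "2012.06.26 (1).tif" from folder "01" -> "2012.06.26 (1) - 01.tif"
                new_name = f"{year}.{month}.{day}{suffix} - {folder}.tif"
                shutil.move(os.path.join(src_dir, name), os.path.join(target_dir, new_name))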

    Read the article

  • What is the best free service to host images and mp3 files?

    - by Edward Tanguay
    I am making an educational social software silverlight application. I would like users to be able to point the application to a URL with text, images, and audio files which they have created. Many users will not have their own website to do this, so we are looking for a free service they can use to upload, and manage their own text/image/audio content. What is the best free service for non-technical users to upload and make available text, images and audio? For instance, sites.google.com allows you to upload pictures and access them via http so that would work, but that is more about making a website. For this purpose we just need the ability to upload files, without the website creation tools.

    Read the article

  • Ubuntu: how to FTP transfer files to the /var/www folder?

    - by jc.yin
    I'm new to Linux, and I've set up a web server with Ubuntu Desktop edition so I can practice with the GUI a bit before transitioning to Ubuntu Server. I've already set up a LAMP stack as well as FTP. Now I just need to know how to transfer my web files to the /var/www folder on Ubuntu. Previously I worked on Mac OS, where there was a central server for all the web files that I could FTP to. Now, after I've managed to connect via FTP to the Ubuntu server, I see all the folders such as Desktop, Downloads, Documents etc., but no web folder. Can anyone help me understand how to FTP to the /var/www folder in Ubuntu? Thanks
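
    The FTP login lands in the user's home directory, which is why only Desktop, Downloads and so on are visible; /var/www lives outside it and is root-owned by default. The usual fixes are to give the login user write access to /var/www (or a subdirectory of it), or to upload over SFTP with an explicit remote path. As an illustration of the second option, a minimal Python sketch using the third-party paramiko library; the host name, user and file names are placeholders:

        import paramiko  # third-party: pip install paramiko

        HOST = "ubuntu-server.local"   # placeholder host name
        USER = "jc"                    # placeholder user with write access to /var/www

        client = paramiko.SSHClient()
        client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
        client.connect(HOST, username=USER)   # assumes key-based auth; add password=... otherwise

        sftp = client.open_sftp()
        # Upload straight to the web root instead of the home directory.
        sftp.put("index.html", "/var/www/index.html")
        sftp.close()
        client.close()

    The permission side is typically handled by adding the user to the www-data group and making /var/www group-writable, or by symlinking a folder from the home directory into /var/www.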

    Read the article

  • Why does cpio say "WARNING! These file names were not selected" when copying a large number of files?

    - by mmm bacon
    For over 10 years, I've been using this strategy to copy a large number of files between UNIX filesystems:

        cd source_directory
        find . -depth -print | cpio -pdm /path/to/destination_directory

    It works like a champ. However, I'm now getting this error from cpio:

        cpio: WARNING! These file names were not selected:
        (long list of files here...)

    The source directory is on OS X 10.5, and the destination directory is an NFS filesystem from an OpenSolaris server. Copying over NFS has never been a problem in the past. There's nothing strange about the filenames, meaning there aren't special characters or anything like that. Any ideas?

    Read the article

  • Adobe Acrobat: How to batch to combine multiple pdf files?

    - by Andrei Andre
    I have 3 folders:

        Folder 1
        Folder 2
        Folder 3

    In each folder I have several pdf files:

        Folder 1
            file1.pdf
            file2.pdf
        Folder 2
            file1.pdf
            file2.pdf
        Folder 3
            file1.pdf
            file2.pdf

    I want each folder to end up with a combined file of its pdf files:

        Folder 1
            binder.pdf
        Folder 2
            binder.pdf
        Folder 3
            binder.pdf

    Any idea? Please don't tell me to do it manually; this case is just to explain my problem. Think that I have hundreds of folders. :) Maybe I can use another tool instead of Adobe Acrobat?
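
    Acrobat's Action Wizard can be scripted to do this, but a lighter option is to drive a command-line PDF tool from a script. A minimal Python sketch that walks every subfolder and merges its PDFs into binder.pdf, assuming the pdftk command-line tool is installed and on the PATH (ROOT is a placeholder path):

        import glob
        import os
        import subprocess

        ROOT = r"C:\pdf-folders"   # placeholder: directory containing the hundreds of folders

        for folder in sorted(os.listdir(ROOT)):
            folder_path = os.path.join(ROOT, folder)
            if not os.path.isdir(folder_path):
                continue
            pdfs = sorted(glob.glob(os.path.join(folder_path, "*.pdf")))
            if not pdfs:
                continue
            out = os.path.join(folder_path, "binder.pdf")
            # pdftk <inputs...> cat output <merged.pdf>
            subprocess.run(["pdftk", *pdfs, "cat", "output", out], check=True)

    A pure-Python alternative would be a PDF library such as pypdf, which avoids the external dependency.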

    Read the article

  • How do you change the "scan this dir for additional ini files" path?

    - by amvx
    I managed to get the custom INI to load, but it's still loading other .ini files from the default location. I created an fcgi wrapper that passed the ini value as a parameter, and that worked. Now those other .ini files just need to be loaded from the same dir as my custom ini. The problem is that the other .ini files are overriding the settings in my custom php.ini. I realize now that the problem is that the php.fcgi binary was compiled with a custom path parameter, so that's a problem; I might have to recompile it using a different location, or none at all. I'd hate to have to compile an fcgi for each domain.
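
    Before recompiling, it may be enough to override the compiled-in scan directory from the wrapper itself: PHP honours the PHP_INI_SCAN_DIR environment variable (and PHPRC for the location of php.ini). A rough Python equivalent of the fcgi wrapper as a sketch; the paths and the php-cgi location are placeholders, not values from the question:

        #!/usr/bin/env python
        import os

        # Point PHP at a per-domain directory for additional .ini files instead of the
        # compiled-in default. PHP_INI_SCAN_DIR and PHPRC are standard PHP environment
        # variables; the paths below are placeholders.
        os.environ["PHP_INI_SCAN_DIR"] = "/var/www/example.com/php-ini"
        os.environ["PHPRC"] = "/var/www/example.com"   # directory containing the custom php.ini

        os.execv("/usr/bin/php-cgi", ["php-cgi"])      # placeholder path to the php-cgi binary

    Setting PHP_INI_SCAN_DIR to an empty value should disable the extra scan entirely, which would stop the default-location .ini files from overriding the custom php.ini.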

    Read the article

  • List/remove files whose filenames contain a date that's "more than a month ago"?

    - by Martin Tóth
    I store some data in files which follow this naming convention:

        /interesting/data/filename-YYYY-MM-DD-HH-MM

    How do I look for the ones whose date in the file name is older than now - 1 month, and delete them? Files may have changed since they were created, so searching by last modification date is no good. What I'm doing now is filtering them in Python:

        prefix = '/interesting/data/filename-'

        import commands
        names = commands.getoutput('ls {0}*'.format(prefix)).splitlines()

        from datetime import datetime, timedelta
        all_files = map(lambda name: {
            'name': name,
            'date': datetime.strptime(name, '{0}%Y-%m-%d-%H-%M'.format(prefix))
        }, names)

        month = datetime.now() - timedelta(days = 30)
        to_delete = filter(lambda item: item['date'] < month, all_files)

        import os
        map(lambda item: os.remove(item['name']), to_delete)

    Is there a (one-liner) bash solution for this?

    Read the article

  • Can I lose files when changing security on an XP drive within Windows 7?

    - by Will
    Hard to come up with a title for this one, sheesh. Have a friend whose computer went down. He asked me to get all his data off his drive. His old computer was running XP. So, I've plugged it into my Windows 7 computer. When I attempt to open up his Documents and Settings folder, I get prompted to elevate in order to "permanently get access to this folder." If I do this, will I be able to access the files in this directory, or will all the current files be lost? I may be overly paranoid about this, but I can't find any information about exactly what will happen when I do this. TIA.

    Read the article

  • How do I recover files from a corrupt VDI file?

    - by Eric P
    Is it possible to repair a corrupt VDI file? The OS on the VDI (XP) doesn't boot at all; it just hangs at a black screen. I was getting file errors on its last boot, but now it's not working at all. A sector viewer shows 'Invalid partition table Error loading operating system Missing operating system'. I tried mounting the file from the host OS, but it just says that the drive isn't formatted. I don't need to be able to run the VDI, but I do need some files that are on it. Is there any way to recover files from the corrupt VDI file?

    Read the article

  • How do I extract files from one tarball to another tarball in one step?

    - by Martin
    I have some fairly large tarball archives from which I need to extract some files. I will later repack those files to transfer them to another server. Currently that is a two (multi) step process for me:

        mkdir ttmp
        tar -vxzf large.tgz -C ttmp/ --strip-components=<INT> <folder-to-be-extracted>

    or alternatively with wildcards:

        mkdir ttmp
        tar -vxzf large.tgz -C ttmp/ --strip-components=<INT> \
            --wildcards --no-anchored '*pattern*'

    Then I go ahead and recompress the created folder:

        tar -vczf small.tgz ttmp/*
        rm -rf ttmp

    How can I combine these two commands into one? Something like this:

        tar -x large.tgz > tar -c small.tgz

    Just to show what I have already tried: whenever I search for the term "extract" I end up here or here or even here. When I use the term "split" I end up here, and that is definitely not what I intend to do. When I use "repack" I end up in strange places.
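
    GNU tar cannot pipe an extraction straight into a new archive, but the intermediate directory can be avoided by streaming members from one archive to the other in a single pass. A minimal Python sketch using the standard tarfile module; the '*pattern*' filter and the file names mirror the question and are placeholders:

        import fnmatch
        import tarfile

        PATTERN = "*pattern*"   # same idea as tar's --wildcards --no-anchored filter

        with tarfile.open("large.tgz", "r:gz") as src, \
             tarfile.open("small.tgz", "w:gz") as dst:
            for member in src:
                if not fnmatch.fnmatch(member.name, PATTERN):
                    continue
                if member.isfile():
                    # Stream the member's bytes straight into the new archive.
                    dst.addfile(member, src.extractfile(member))
                else:
                    dst.addfile(member)   # directories, symlinks, etc. carry no data stream

    Replicating --strip-components would mean rewriting member.name before each addfile call.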

    Read the article

  • Why does the forfiles command list files that are not a day old, despite being told otherwise?

    - by PeanutsMonkey
    The command I am executing is:

        forfiles -p"C:\testdata" -m*.* -d-1 -c"cmd /c echo @PATH\@FILE"

    I have specified that I only want to list files that are at least a day old, yet when I execute the statement it returns a list of files that were created today. Why is that? Am I doing something wrong? Would it be better to specify a time period as opposed to a date, e.g. 24 hours? The version of forfiles I have reads as follows: FORFILES v 1.1 - [email protected] - 4/98. The batch file is being run on Windows XP.
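
    Whatever the 1998-era forfiles 1.1 is doing with -d-1, a small script gives an unambiguous "older than 24 hours" cutoff if a time period rather than a calendar date is what is really wanted. A minimal Python sketch (the directory is the one from the question; the 24-hour threshold and the use of the modification time are assumptions):

        import os
        import time

        DIRECTORY = r"C:\testdata"
        CUTOFF = time.time() - 24 * 60 * 60   # strictly older than 24 hours

        for name in os.listdir(DIRECTORY):
            path = os.path.join(DIRECTORY, name)
            if os.path.isfile(path) and os.path.getmtime(path) < CUTOFF:
                print(path)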

    Read the article

  • What's the fastest way to store/access large files?

    - by philfreo
    I do a lot of video editing on my Mac and need a way to store very large (30 GB) files, and don't have room on my HD. A USB/Firewire external hard drive would work, but it seems way too slow for consistently working with such large files. I've also considered buying another computer, with a large hard drive, and putting it on the same network with a shared folder. What's the fastest / most efficient way to do this? Please consider USB 2.0 speeds, hard drive read times, ethernet speeds, etc. Are there other options I should consider?

    Read the article

  • How can I share files from my Windows 7 machine to my friend's Ubuntu machine?

    - by ProfKaos
    I run a Windows 7 Pro SP1 laptop as my home machine, and my housemate runs an Ubuntu 12.04.1 desktop. We share a WLAN. I would like to make certain locations and files available for him to read and maybe write. How can I go about this, bearing in mind that I have very little recent experience with modern Linux, and Ubuntu in particular? My first idea is to share a Windows folder with my Ubuntu VM under VMware Player; his Ubuntu machine could then connect to my Ubuntu VM, and the two could use whatever magic Ubuntu uses to achieve file sharing. This requires my Ubuntu VM to be always running, though, and that may not always be possible. I have also heard that Samba may have a feature to help here, but I know nothing about that. How can I share my Windows files with my mate's Ubuntu machine, preferably with a direct 1-to-1 connection, i.e. without routing through a shim VM?

    Read the article

  • How do I catalog files on several external hard drives that I want to store offline? (OS X)

    - by raudi
    My partner, an artist, has more than 10 external hard disks, both USB and FireWire, and every 2-3 months a new one has to be added (she's working with videos and pictures). Currently it's 10 TB and growing, so too much for an affordable NAS. Right now the files are not indexed and, I think, cannot be searched with Spotlight, because not all drives can be connected at the same time. So if she wants to search for a file, she has to guess which disk or disks (based mostly on the date) and then search several drives. Now I'm looking for a solution to index/catalog the drives, something like GentibusCD, Cathy or Disclib (all these solutions are unfortunately Windows only). Is there any software for OS X that will catalog all the hard drives, so she can search the catalog, find the files, and get the ID of the disk / disk name that has the content? Preferably something with a GUI so my partner can also use it easily, and preferably with thumbnails for pictures/videos (but even an equivalent of "tree /F /A" would be better than nothing).
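
    If no GUI cataloguer turns out to fit, the "tree /F /A" fallback is easy to script so each disk only has to be plugged in once. A minimal Python sketch that appends every file on a mounted volume to one searchable CSV catalog; the volume name and catalog path are placeholders, and there are no thumbnails here:

        import csv
        import os
        import sys

        VOLUME = "/Volumes/ArtDisk01"   # placeholder: the currently mounted external drive
        CATALOG = os.path.expanduser("~/drive-catalog.csv")

        disk_name = os.path.basename(VOLUME)
        with open(CATALOG, "a", newline="") as fh:
            writer = csv.writer(fh)
            for dirpath, _dirnames, filenames in os.walk(VOLUME):
                for name in filenames:
                    path = os.path.join(dirpath, name)
                    try:
                        size = os.path.getsize(path)
                    except OSError:
                        size = ""
                    writer.writerow([disk_name, path, size])

        print("catalogued", disk_name, "into", CATALOG, file=sys.stderr)

    The resulting CSV can be searched with Spotlight, grep or a spreadsheet even when the disk itself is sitting on a shelf.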

    Read the article

  • How to print TIFF files using Microsoft Office Document Imaging?

    - by Think Floyd
    OS: Vista and Windows 7. I have Microsoft Office Document Imaging installed, and the .tif and .tiff file association is set to Microsoft Office Document Imaging. When I open a TIFF file, it opens in Microsoft Office Document Imaging. Good so far. However, when I right-click on the TIFF file and invoke Print, I see a "Print Pictures" dialog ("How do you want to print your pictures?"). I have some applications installed on my machine that print incoming TIFF files on the printer. They work fine on XP; however, on Vista and Windows 7 I get this "Print Pictures" prompt requiring user intervention (i.e. a click on the Print button). How do I get rid of this "Print Pictures" prompt?

    Read the article

  • How do I set the umask for files and directories created from the GUI in MacOS X Lion (10.7)?

    - by Avry
    I've set my umask in my .bashrc file to 007. Any files created on the command line after loading my bashrc file respects this setting. I want to be able to set the umask to 007 for any files created using non-command line apps. This document talks about setting the umask via launchd. And it kind of works. If I follow these directions I can change the default permissions on a GUI created file from rw-r--r-- to rw-rw---- but the directories still are not group writeable (i.e. I want them to be rwxrwx--- but they are rwxr-x--- instead) The analog on Linux would be /etc/login.defs as the place to set the umask. What do I change in order for the umask to be set properly (i.e. the way I want it)?

    Read the article
