Search Results

Search found 54055 results on 2163 pages for 'multiple files'.


  • How do you change the "scan this dir for additional ini files" path?

    - by amvx
    I managed to get the custom INI to load, but it's still loading other .ini files from the default location. I created an FCGI wrapper that passed the ini value as a parameter, and that worked. Now these other .ini files need to be loaded from the same directory as my custom ini; the problem is that they are overriding the settings in my custom php.ini. I realize now that the php.fcgi binary was compiled with a custom scan-path parameter, so that's the problem. I might have to recompile it using a different location, or none at all. I'd hate to have to compile an FCGI binary for each domain.
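
    If the PHP build is recent enough to honour them, the standard environment variables PHP_INI_SCAN_DIR (overrides the compiled-in scan directory) and PHPRC (where php.ini itself is looked up) can be set per domain from the FCGI wrapper instead of recompiling. A hedged sketch; all paths are placeholders:

        #!/bin/sh
        # hypothetical per-domain wrapper (paths and the php-cgi location are placeholders)
        export PHPRC=/var/www/example.com/conf             # directory containing this domain's php.ini
        export PHP_INI_SCAN_DIR=/var/www/example.com/conf  # scan only this directory for additional .ini files
        exec /usr/bin/php-cgi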

    Read the article

  • Ubuntu: how do I FTP transfer files to the /var/www folder?

    - by jc.yin
    I'm new to Linux and I've set up a web server with Ubuntu Desktop edition so I can practice with the GUI a bit before transitioning to Ubuntu Server. I've already set up a LAMP stack as well as FTP. Now I just need to know how to transfer my web files to the /var/www folder in Ubuntu. Previously I worked on Mac OS, where there was a central server holding all the web files that I could FTP to. Now that I've managed to connect via FTP to the Ubuntu server, I see all the folders such as Desktop, Downloads, Documents etc., but no web folder. Can anyone help me understand how to FTP to the /var/www folder in Ubuntu? Thanks
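
    FTP logs you into the user's home directory, and /var/www is normally owned by root, which is why no web folder shows up. A hedged sketch, assuming the OpenSSH server is installed on the Ubuntu box (user name and paths are placeholders): give your own user write access to /var/www, then copy over SCP/SFTP.

        # on the Ubuntu box: let user 'jc' (group www-data) write to /var/www
        sudo chown -R jc:www-data /var/www
        sudo chmod -R g+w /var/www

        # from the Mac: copy the site straight into /var/www over SSH
        scp -r ~/Sites/mysite/* jc@ubuntu-server:/var/www/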

    Read the article

  • Can I lose files when changing security on an XP drive within Windows 7?

    - by Will
    Hard to come up with a title for this one, sheesh. Have a friend whose computer went down. He asked me to get all his data off his drive. His old computer was running XP. So, I've plugged it into my Windows 7 computer. When I attempt to open up his Documents and Settings folder, I get prompted to elevate in order to "permanently get access to this folder." If I do this, will I be able to access the files in this directory, or will all the current files be lost? I may be overly paranoid about this, but I can't find any information about exactly what will happen when I do this. TIA.

    Read the article

  • List/remove files whose filenames contain a date that is "more than a month ago"?

    - by Martin Tóth
    I store some data in files which follow this naming convention:

        /interesting/data/filename-YYYY-MM-DD-HH-MM

    How do I look for the ones whose date in the file name is older than now minus one month, and delete them? Files may have changed since they were created, so searching by last modification date is no good. What I'm doing now is filtering them in Python:

        prefix = '/interesting/data/filename-'

        import commands
        names = commands.getoutput('ls {0}*'.format(prefix)).splitlines()

        from datetime import datetime, timedelta
        all_files = map(lambda name: {
            'name': name,
            'date': datetime.strptime(name, '{0}%Y-%m-%d-%H-%M'.format(prefix))
        }, names)

        month = datetime.now() - timedelta(days = 30)
        to_delete = filter(lambda item: item['date'] < month, all_files)

        import os
        map(lambda item: os.remove(item['name']), to_delete)

    Is there a (one-liner) bash solution for this?
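
    Because the YYYY-MM-DD-HH-MM format sorts lexicographically, a plain string comparison against a cutoff works. A hedged bash sketch (GNU date assumed for the "1 month ago" arithmetic):

        cutoff=$(date -d '1 month ago' +%Y-%m-%d-%H-%M)
        for f in /interesting/data/filename-*; do
            # compare the timestamp part of the name against the cutoff string
            [[ "${f##*filename-}" < "$cutoff" ]] && rm -- "$f"
        done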

    Read the article

  • What's the fastest way to store/access large files?

    - by philfreo
    I do a lot of video editing on my Mac and need a way to store very large (30 GB) files, and don't have room on my HD. A USB/Firewire external hard drive would work, but it seems way too slow for consistently working with such large files. I've also considered buying another computer, with a large hard drive, and putting it on the same network with a shared folder. What's the fastest / most efficient way to do this? Please consider USB 2.0 speeds, hard drive read times, ethernet speeds, etc. Are there other options I should consider?

    Read the article

  • Is a wildcard SSL certificate the only option in this multiple-vhost/one-IP setup?

    - by solsol
    I have a web app set up that needs the following SSL encryption:

        secure.myapp.com -> SSL
        www.myapp.com/login -> SSL
        www.myapp.com/signup -> SSL

    If I'm correct, one SSL certificate could cover all of the www.myapp.com/* pages. The problem is the subdomain secure.myapp.com, which needs to be on a separate IP address to work with SSL. Right now I have one server, one public IP and a number of virtual hosts in Apache to make this work. I'd rather not buy an expensive wildcard SSL certificate just to secure one subdomain. What is your advice on this? If it IS the only solution, any tips on getting a reasonably priced wildcard SSL certificate are appreciated. I have read about SNI, which allows the use of multiple SSL certificates on one IP, but not all browsers (IE6!) support it. Since we are building a web app for the public, we cannot let IE6 users fall back to unencrypted connections. Thanks for your help
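
    One middle ground, offered here as a hedged suggestion rather than the only answer: a multi-domain (SAN/UCC) certificate that lists both www.myapp.com and secure.myapp.com works on a single IP and with old browsers, since only one certificate is ever presented. A sketch of generating such a CSR (the -addext flag needs OpenSSL 1.1.1 or newer; file names are placeholders):

        openssl req -new -newkey rsa:2048 -nodes \
            -keyout myapp.key -out myapp.csr \
            -subj "/CN=www.myapp.com" \
            -addext "subjectAltName=DNS:www.myapp.com,DNS:secure.myapp.com"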

    Read the article

  • How do I extract files from one tarball to another tarball in one step?

    - by Martin
    I have some fairly large tarball archives from which I need to extract some files. I will later repack those files to transfer them to another server. Currently that is a two (multi) step process for me:

        mkdir ttmp
        tar -vxzf large.tgz -C ttmp/ --strip-components=<INT> <folder-to-be-extracted>

    or alternatively with wildcards:

        mkdir ttmp
        tar -vxzf large.tgz -C ttmp/ --strip-components=<INT> \
            --wildcards --no-anchored '*pattern*'

    Then I go ahead and recompress the created folder:

        tar -vczf small.tgz ttmp/*
        rm -rf ttmp

    How can I combine these two commands into one? Something like:

        tar -x large.tgz > tar -c small.tgz

    Just to show what I already tried: whenever I search for the term "extract" I end up here or here or even here. When I use the term "split" I end up here, and that is definitely not what I intend to do. When I use "repack" I end up in strange places.
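
    A true archive-to-archive pipe isn't something tar offers directly, since the extracted names have to exist somewhere before they can be re-added, but the whole thing can be collapsed into a single command line with a throw-away staging directory. A hedged sketch, reusing the question's own placeholders:

        tmp=$(mktemp -d) && \
            tar -xzf large.tgz -C "$tmp" --strip-components=<INT> \
                --wildcards --no-anchored '*pattern*' && \
            tar -czf small.tgz -C "$tmp" . && \
            rm -rf "$tmp"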

    Read the article

  • How can I share files from my Windows 7 machine to my friend's Ubuntu machine?

    - by ProfKaos
    I run a Windows 7 Pro SP1 laptop as my home machine, and my housemate runs an Ubuntu 12.04.1 desktop. We share a WLAN. I would like to make certain locations and files available for him to read and maybe write. How can I go about this, bearing in mind I have very little recent experience with modern Linux, and Ubuntu in particular? My first idea is to share a Windows folder with my Ubuntu VM under VMware Player; his Ubuntu machine could then connect to my Ubuntu VM, and the two could use whatever magic Ubuntu uses to achieve file sharing. This requires my Ubuntu VM to be always running, though, and that may not always be possible. I have also heard that Samba may have a feature to help here, but I know nothing about that. How can I share my Windows files with my mate's Ubuntu machine, preferably with a one-to-one connection, i.e. rather not using shim VMs?
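
    Windows 7 and Ubuntu already share a file-sharing protocol (SMB/CIFS), so no intermediate VM is needed: share the folder from Windows, then either browse to it from Nautilus (smb://...) or mount it. A hedged sketch of the mount, with machine, share and user names as placeholders:

        sudo apt-get install cifs-utils
        sudo mkdir -p /mnt/win7share
        sudo mount -t cifs //WIN7-LAPTOP/Shared /mnt/win7share \
            -o username=profkaos,uid=$(id -u),gid=$(id -g)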

    Read the article

  • How do I recover files from a corrupt VDI file?

    - by Eric P
    Is it possible to repair a corrupt VDI file? The OS on the VDI (XP) doesn't boot at all; it just hangs at a black screen. I was getting file errors on its last boot, but now it's not working at all. A sector viewer shows 'Invalid partition table Error loading operating system Missing operating system'. I tried mounting the file from the host OS, but it just says that the drive isn't formatted. I don't need to be able to run the VDI, but I do need some files that are on it. Is there any way to recover files from the corrupt VDI file?
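
    One hedged approach, assuming VirtualBox is still installed and a Linux host (or live CD) is available for the mount step: convert the VDI to a raw disk image and point ordinary recovery tools at that, since they only care about the raw bytes, not the VDI container. File names are placeholders.

        # flatten the VDI into a raw image
        VBoxManage clonehd corrupt.vdi disk.raw --format RAW

        # try a read-only mount of the first partition (losetup -P scans the partition table)
        sudo losetup -fP disk.raw
        sudo mount -o ro /dev/loop0p1 /mnt/recovery

        # if the partition table really is gone, testdisk/photorec can scan the raw image directly
        testdisk disk.raw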

    Read the article

  • Is there any way to distribute x264 encoding jobs across multiple computers (to increase the encoding speed)?

    - by Breakthrough
    Does anyone know of a current, active solution to encoding x264 videos across many computers (via the network) to increase encoding FPS? Brownie points for cross-platform and open source, but just so you all know, I usually use Windows. Programs that I have heard of, and why I do not believe they are suitable: x264farm: Not actively developed. Good interface, but does not support two-pass encoding, and fails with newer x264 builds. ELDER: Again, not actively developed, but my issue was that it didn't work with new x264 builds, and it was very difficult to configure (read: randomly stopped working). While I don't absolutely need a program which is being actively developed, I would like one that supports two-pass encoding, and works with new(er) x264 builds. Additional information: So far, I've offered (and awarded!) two separate bounties on this question since I first posted it over two years ago, and I still haven't found a solution to this problem. What I'm looking for basically is a simple program to allow me to encode x264 videos using the processing power of multiple computers connected over a LAN. Furthermore, it would be nice if it worked with new(er) x264 builds, and supported two-pass encoding. If at any time someone has an updated answer, or a new solution to this problem, please post it and it will be given some consideration.

    Read the article

  • Seeking a solution to automatically copy files from the CD-ROM disc to the USB drive once it's connected.

    - by Ray Nathan
    I plan to distribute a free CD that automatically copies files to a connected USB device. This process will be done on the computers of the users who obtain the CD. The CD will contain an autorun.inf file that instructs the computer to copy a set of files located on the CD to a specific directory on the connected USB device. The USB drive letter is not the same on all systems, so Windows XP needs to determine the drive letter of the USB device automatically before the copy operation begins. What would be the best way of creating a short batch file or script that I can place on the CD to execute this process? Also, please note that it is NOT feasible or recommended to include a batch file on the USB devices to sync this operation, for the reason explained at the beginning of this paragraph. :) Thank you all

    Read the article

  • What's the best way to clone multiple PCs from one machine?

    - by Jason T.
    Where I work we have dozens and dozens of old ThinkPad laptops. A lot of these can be reused, but not for our needs; they have long since been replaced. The higher-ups have decided to donate them to charity, and for better or for worse I have been tasked with reimaging them. I took a laptop, installed the factory copy of Windows, updated it and configured it appropriately. Now I'm trying to reimage it to dozens of other laptops. What's some good software to do this? First I used Clonezilla to clone the HDD in the laptop to an internal drive in an external enclosure, and it worked. Then I tried taking the base image out and connecting it externally to a laptop that needed to be imaged, and I got it to work a few times. So far so good, right? Well, once I informed my boss of my findings and what I planned to do, the images started to not work on new laptops. One of three things would happen: the ThinkPads would just blink at me and Windows wouldn't load; or Windows would load but freeze within two minutes; or, last but not least, the laptops would BSOD during the Windows XP boot. These laptops are not going to be used by the company - they're going to charity. So can anyone else recommend a way to reimage multiple laptops?

    Read the article

  • Why does the command forfiles list files that are not a day old, despite being told otherwise?

    - by PeanutsMonkey
    The command I am executing is:

        forfiles -p"C:\testdata" -m*.* -d-1 -c"cmd /c echo @PATH\@FILE"

    I have specified that I only want to list files that are at least a day old; however, when I execute the statement, it returns a list of files that were created today. Why is that? Am I doing something wrong? Would it be better to specify a time period as opposed to a date, e.g. 24 hours? The version of forfiles I have reads as follows:

        FORFILES v 1.1 - [email protected] - 4/98

    The batch file is being run on Windows XP.

    Read the article

  • How to print TIFF files using Microsoft Office Document Imaging?

    - by Think Floyd
    OS: Vista and Windows 7. I have Microsoft Office Document Imaging installed, and the .tif and .tiff file association is set to Microsoft Office Document Imaging. When I open a TIFF file, it opens in Microsoft Office Document Imaging. Good so far. However, when I right-click on the TIFF file and invoke Print, I see a "Print Pictures" dialog ("How do you want to print your pictures?"). I have some applications installed on my machine that print incoming TIFF files on the printer. They work fine on XP. However, on Vista and Windows 7, I get this "Print Pictures" prompt requiring user intervention (i.e., a click on the Print button). How do I get rid of this "Print Pictures" prompt?

    Read the article

  • How do I set the umask for files and directories created from the GUI in MacOS X Lion (10.7)?

    - by Avry
    I've set my umask in my .bashrc file to 007. Any files created on the command line after loading my bashrc file respect this setting. I want to be able to set the umask to 007 for any files created using non-command-line apps. This document talks about setting the umask via launchd, and it kind of works: if I follow those directions I can change the default permissions on a GUI-created file from rw-r--r-- to rw-rw----, but the directories still are not group-writable (i.e. I want them to be rwxrwx--- but they are rwxr-x--- instead). The analog on Linux would be /etc/login.defs as the place to set the umask. What do I change in order for the umask to be set properly (i.e. the way I want it)?

    Read the article

  • How can I audit a Linux filesystem for files which have been changed or added within a specific time period?

    - by Bcos
    We are a website design/hosting company running several sites on a Linux server using Joomla 1.5.14, and recently someone was able to exploit a vulnerability in the RW Cards component to write arbitrary files and modify existing files on our filesystem, enabling them to do some nasty things to our customers' sites. We have removed the vulnerable modules from all sites but are still seeing some problems. We suspect that they still have some scripts installed, and we need a way to audit anything that has been changed or added in the last 10 days. Is there a command or script we can run to do this?
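
    A hedged starting point with GNU find, using both mtime (content modified) and ctime (metadata changed, which also catches freshly created files); the path is a placeholder for the web roots:

        find /var/www -type f \( -mtime -10 -o -ctime -10 \) \
            -printf '%TY-%Tm-%Td %TT  %p\n' | sort

    Note that an attacker who gained root could have reset timestamps, so this is an audit aid rather than proof of integrity.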

    Read the article

  • How can I evaluate the best choice of archive format for compressing files?

    - by Mehrdad
    In general, I've observed the following:

    - Linux-y files or tools use bzip2 or gzip for distributing archives
    - Windows-y files or tools use ZIP for distributing archives
    - Many people use 7-Zip for creating and distributing their own archives

    Questions:

    - What are the advantages and disadvantages of these formats, all of which appear to be open formats?
    - When/why should I choose one (say, 7-Zip) over another (say, ZIP)?
    - Why does the trend above appear to hold, even though all of these are portable formats?
    - Are there any particular advantages to using a particular archive format on a particular platform?
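
    When the choice matters for a specific data set, a quick local measurement usually says more than general trends. A hedged benchmark sketch (GNU time assumed; the -k flag, which keeps the input file, needs reasonably recent gzip/bzip2/xz):

        tar -cf sample.tar ./sample-data            # pick a representative directory
        for tool in gzip bzip2 xz; do
            /usr/bin/time -f "$tool -9: %e s" "$tool" -9 -k sample.tar
        done
        ls -lh sample.tar*                          # compare the resulting sizes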

    Read the article

  • What is the best filesystem for storing thousands of files in one dictionary-like id-blob structure?

    - by Ivan
    What filesystem best suits my needs?

    - Thousands or even millions of files in one directory.
    - Good reliability (at or close to ext4 & NTFS level, including fault tolerance) and access speed.
    - No directories or descriptive names actually needed; a dictionary-like structure of id-blob pairs is all I need.
    - No links, attributes, or access-control features needed.

    The purpose is a file storage where all the metadata (data describing all the facts about what the file actually contains and who can access it) is stored in a MySQL database. As far as I know, common filesystems like NTFS and ext3/4 can go dead-slow if there are too many files placed in one directory - that's why I ask.
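
    A hedged aside: the usual way to sidestep the huge-directory problem entirely, whatever the filesystem, is to fan the ids out over a couple of hashed subdirectory levels (the same trick Git uses for its object store). An illustrative sketch; paths and names are placeholders:

        store() {
            local id="$1" src="$2"
            local h; h=$(printf '%s' "$id" | md5sum | cut -c1-4)
            local dir="/storage/${h:0:2}/${h:2:2}"   # e.g. /storage/9f/3a
            mkdir -p "$dir" && cp "$src" "$dir/$id"
        }

        store 1234567 ./blob.bin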

    Read the article

  • How do I catalog files on several external hard drives that I want to store off-line? OSX

    - by raudi
    My partner, an artist, has more than 10 external hard disks, both USB and FireWire, and every 2-3 months a new one has to be added (she's working with videos and pictures). Currently it's 10 TB and growing, so too much for an affordable NAS. Right now the files are not indexed and, I think, cannot be searched with Spotlight, because not all drives can be connected at the same time. So if she wants to search for a file, she has to guess which disk/disks (based mostly on the date) and then search several drives. Now I'm looking for a solution to index/catalog the drives, something like GentibusCD, Cathy or Disclib (all of these are unfortunately Windows-only). Is there any software for OS X that will catalog all the hard drives, so she can search the catalog, find the files, and get the ID of the disk / disk name that has the content? Preferably something with a GUI so my partner can also use it easily, and preferably with thumbnails for pictures/videos (but even an equivalent to "tree /F /A" would be better than nothing).
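
    Even without dedicated software, a plain text catalog per drive answers the "which disk is it on" question and stays searchable with grep (or Spotlight, once the files are indexed). A hedged sketch, with the volume name and catalog folder as placeholders, run once whenever a drive is connected:

        DRIVE="/Volumes/Archive-07"
        mkdir -p ~/DriveCatalog
        find "$DRIVE" -type f > ~/DriveCatalog/"$(basename "$DRIVE")".txt

        # later, search every catalog at once:
        grep -i "projectname" ~/DriveCatalog/*.txt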

    Read the article

  • Adding text to the beginning and end of a number of files?

    - by John Feminella
    I have a number of files in a directory hierarchy. For each file, I'd like to add "abcdef" to the beginning, on its own line, and "ghijkl" to the end, on its own line. For example, if the files initially contained:

        # one/foo.txt
        apples
        bananas

        # two/three/bar.txt
        coconuts

    Then afterwards, I'd expect them to contain:

        # one/foo.txt
        abcdef
        apples
        bananas
        ghijkl

        # two/three/bar.txt
        abcdef
        coconuts
        ghijkl

    What's the best way to do this? I've gotten as far as:

        # put stuff at start of file
        find . -type f -print0 | xargs -0 sed -i 's/.../abcdef/g'

        # put stuff at end of file
        find . -type f -print0 | xargs -0 sed -i 's/.../ghijkl/g'

    but I can't seem to figure out what to put in the ellipses.
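
    A hedged nudge in the same direction: with GNU sed the first and last lines can be addressed directly, so no substitution pattern is needed at all.

        # insert "abcdef" before line 1 and append "ghijkl" after the last line, in place
        find . -type f -print0 | xargs -0 sed -i -e '1i abcdef' -e '$a ghijkl'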

    Read the article

  • How can I resize images in multiple subdirectories more effectively?

    - by jtfairbank
    I have the original images in a directory structure that looks like this:

        ./Alabama/1.jpg
        ./Alabama/2.jpg
        ./Alabama/3.jpg
        ./Alaska/1.jpg
        ...the rest of the states...

    I wanted to convert all of the original images into thumbnails so I can display them on a website. After a bit of digging / experimenting, I came up with the following Linux command:

        find . -type f -iname '*.jpg' | sed -e 's/\.jpg$//' | xargs -I Y convert Y.jpg -thumbnail x100\> Y-small.jpg

    It recursively finds all the jpg images in my subdirectories, removes the file type (.jpg) from them so I can rename them later, then makes them into thumbnails and renames them with '-small' appended before the file type. It worked for my purposes, but it's a tad complicated and it isn't very robust. For example, I'm not sure how I would insert 'small-' at the beginning of the file's name (so ./Alabama/small-1.jpg).

    Questions: Is there a better, more robust way of creating thumbnails from images that are located in multiple subdirectories? Can I make the existing command more robust (for example, by using sed to rename the outputted thumbnail before it is saved - basically modify the Y-small.jpg part)?
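
    A hedged alternative sketch: a while-read loop over find's output keeps the name handling in bash, which makes the 'small-' prefix (or any other naming scheme) trivial. It uses the same ImageMagick convert call as above.

        find . -type f -iname '*.jpg' -print0 |
        while IFS= read -r -d '' f; do
            dir=$(dirname "$f")
            base=$(basename "$f")
            convert "$f" -thumbnail x100\> "$dir/small-$base"
        done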

    Read the article

  • Apache crashing at random intervals. Cannot find a reason in the log files

    - by Nick Downton
    We are having an issue with a VPS running Plesk 9.5 on Ubuntu 8.04. At seemingly random intervals Apache will disappear and needs to be started manually. I have checked the Apache error log, /var/log/messages, and the individual virtual hosts' Apache error files, and cannot find anything that coincides with the time of the failure. dmesg is empty, which is a bit odd. We have also had the psa service go down for no apparent reason while Apache stayed up. I'm at a loss to diagnose this, because none of the log files I can find point to any issues. Are there any others I can look at? Memory usage sits at about 55% (out of 400 MB) and it isn't a particularly high-traffic server. Any pointers as to where else I can find out what is going on would be very much appreciated. Nick
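
    Two hedged things worth checking beyond the Apache logs: whether the kernel's OOM killer is quietly taking the process down (plausible on a 400 MB VPS despite the 55% average), and what memory looks like around the failures.

        # look for OOM-killer activity across the system logs
        grep -iE 'oom|out of memory|killed process' /var/log/syslog /var/log/kern.log /var/log/messages

        # record memory/swap every minute so the next crash has context
        vmstat 60 >> /var/log/vmstat.log &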

    Read the article

  • Why is ext3 so slow to delete large files?

    - by Janis Peisenieks
    I have a server which makes an incremental backup of a system every night. On Saturdays, there is a full backup. After the full backup has finished, a script kicks in that deletes the incrementals. The script sometimes breaks, because the incrementals are each about 10 GB files and deleting them sometimes takes too long for the script. Could someone explain to me, or point me in the direction of a resource that explains, why ext3 is so slow to delete files compared to, let's say, NTFS? I know these are two completely different file systems, but I'm really interested in why there is such a big difference in deletion time.

    Read the article

  • How could I portably split large backup files over multiple discs?

    - by sourcejedi
    Context: I make backups / archives, primarily of photos. I'm experimenting with Bup, which is designed for backup to hard disk. Basically it creates Git repos which include packfiles of up to 1 GB. But I still need last-ditch backups to keep offline and move offsite (and keeping them on read-only media is good too!). What are the options for archiving and splitting large files over several discs like CDs (and reading them back!)? I'd prefer methods which:

    - will stay readable in future;
    - are portable, e.g. to Windows;
    - have known simple implementations, so I could re-implement them myself if necessary. (Using Bup packs will stretch my robustness budget, so I want to be confident about how the other parts of the system would behave.)

    I heard split archives are possible with both ZIP and 7-Zip. Is that right?
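
    On the last point: yes, both support it (zip -s for split archives, 7z -v for volumes). The most re-implementable option of all, though, is arguably plain split/cat, sketched here as a hedged example (piece size chosen to suit the disc):

        # cut the archive into CD-sized pieces
        split -b 650m backup.tar.gz backup.tar.gz.part-

        # record checksums so bit-rot on the discs is detectable
        sha256sum backup.tar.gz.part-* > backup.sha256

        # to restore, simply concatenate the pieces back together
        cat backup.tar.gz.part-* > backup.tar.gz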

    Read the article

  • What is a Windows text editor that will make it easy for me to have four text files open onscreen at once?

    - by Ascendant
    When brainstorming / planning I like to have four text files open onscreen at once: One for notes/stream of consciousness, one for action items to follow up on, one for a rough outline, etc.... What I'm looking for is an easy way to create / save four text files in this manner in Windows. Most importantly, I need the lines to wrap based on the width of the actual window itself. Not based on a ruler or document size (a la Word or WordPad) and not wrapping "manually only" (like Windows' built in Notepad application.) Also, I need the windows to have no, or at least, little, fluff at the top of each document (menubars, ribbons, etc.) On my Mac, I've found that the built-in TextEdit application is almost perfect for this. There's no header or ribbon taking up space for each document, and lines wrap when they hit the end of the window. I haven't had any luck finding a Windows application that works the same way.

    Read the article
