Search Results

Search found 46908 results on 1877 pages for 'managing files and folder'.

Page 559/1877 | < Previous Page | 555 556 557 558 559 560 561 562 563 564 565 566  | Next Page >

  • Apache - building extensions with apxs

    - by Brian
    Hello, pardon the newbie question - I haven't manually compiled Apache modules (or anything else) before. I am trying to get the mod_concat module going. It seems simple enough - just download the mod_concat.c file and then run: apxs -c mod_concat.c This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files - mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo - along with a new subfolder called .libs/ that contains mod_concat.so and some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my Apache modules directory with the rest? Thanks for the help, Brian

    Read the article

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent - sometimes longer, sometimes shorter. I don't think the script is timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory limit problem, because the amount downloaded varies each time. Does anyone know of any Apache or PHP related settings which could cause this kind of problem?

    Read the article

  • Tell browsers to cache until last modified date changes?

    - by Chad Johnson
    My web site consists of static HTML files which are usually republished once per day, and sometimes more often. I'm using Apache. In the vhost settings for my site, I'd like to tell browsers to cache HTML files indefinitely, until Apache sees that they have been modified. So as soon as an HTML file is changed, Apache should immediately begin telling browsers it has changed and send the updated file. As soon as a new file is published, browsers should immediately begin receiving it... they should never receive old versions of files. Maybe ExpiresByType text/html based on "modification" with no "plus x days"? Is something like this possible?

    Read the article

  • IIS's SMTP Pickup timing

    - by fatcat1111
    I have IIS's SMTP server set up as a closed relay, and it's working nicely. I also have an application that writes EML files. If the EML files are written to a temporary directory, then moved to the server's Pickup directory, email is sent as expected. However, if I have the application write the EML files directly to the Pickup directory, the email will often fail to send. This seems to be a race condition: the server starts processing the EML file as soon as it detects it in Pickup, even though the application hasn't completed writing it. The result is the server considers the EML to be malformed, and it punts it to Badmail. While I very much appreciate the server's earnestness, it seems that I need to dial it back a bit for this scenario. Does anybody know if IIS's SMTP server's polling frequency can be configured? I am using IIS7, Windows Server 2008 R2. The application that writes the EML cannot be modified.

    Read the article

  • Server 2003 Remote Desktop loses its virtual printer image of the local printer

    - by Charles Hart
    Server 2003 Remote Desktop provides service to stores served by several ISPs. The server loses its virtual printer image of the local printer (as seen from the remote store site), and a copy of the original local printer appears on the local computer with a different driver, without notice. Specifically: a remote desktop session is opened on a local computer that has a Brother HL2140 USB printer connected and the associated software installed, with the correct driver shown under the "advanced" button. The server has the same Brother software and driver. An application running on the server attempts to print on the local printer connected to the local computer running Vista Pro or XP Pro. Either it works correctly (good), or it does not print (bad), or it prints on another local printer connected to another local computer logged into the server (bad and odd). When it doesn't print (or prints somewhere else) we ask the customer to look for the (virtual) printer using the Remote Desktop view of the server, and the printer is gone. Then we ask the customer to look at the Printers folder on the local computer. There are several possibilities:
    1. The printer is there, but the driver has mysteriously changed in the drop-down to "MDX" something; we have the customer select the other (proper) Brother driver, and all is well again - after the change, the virtual printer on the server (which now matches the local printer) appears again, and printing can resume.
    2. A "copy" of the printer mysteriously appears in the local Printers folder, and after we delete it the virtual printer on the server appears again and printing can resume.
    Note that in both case 1 and 2, the server sometimes sends the print job elsewhere, to some other local computer. Meanwhile, endless errors are reported in the log file and the server eventually crashes, sometimes twice a day. I'm puzzled about what changes the local printer driver, and about what loads the "copy 2" or "copy 3" of the printer in the local Printers folder. This entire description randomly occurs on any of 40+ local computers in eight different locations on different ISPs, all sharing one domain.

    Read the article

  • Why is it bad to map network drives in Windows?

    - by Beeblebrox
    There has been some spirited discussion within our IT department about mapping network drives. In particular, it has been said that mapping network drives is A Bad Thing and that adding DFS paths or network shares to your (Windows Explorer/Libraries) Favourites is a far better solution. Why is this the case? Personally I find the convenience of z:\folder better than \\server\path\folder, particularly on the command line and in scripting (I'm not talking about hard-coded links, naturally!). I have tried searching for pros and cons of mapped network drives, but I haven't seen anything other than 'should the network go down, the drive will be unavailable'. But this is a limitation of any network-accessed storage... I have also been told that mapped network drives poll the network when the network resource is unavailable, but I haven't found more information on this. Wouldn't this still be an issue with other network access mechanisms (that is, mapped Favourites) whenever Windows tries to enumerate the file system (for example, when a file/folder picker dialog is opened)? In short: do network drives poll the network any more than a Windows Explorer library/favourite?

    Read the article

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the module from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add the following to httpd.conf, which I also did:
    LoadModule h264_streaming_module modules/mod_h264_streaming.so
    AddHandler h264-streaming.extensions .mp4
    However, I get this error when I restart Apache:
    Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf: Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server: The specified module could not be found. Note the errors or messages above, and press the <ESC> key to exit. 26...
    Even though the file exists right here: C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so
    Can anyone tell me what I'm doing wrong?

    Read the article

  • Google App Engine says "Must authenticate first." while trying to deploy any app

    - by Oleksandr Bolotov
    Google App Engine says "Must authenticate first." while trying to deploy any app:
    me@myhost /opt/google_appengine $ python appcfg.py update ~/sda2/workspace/lyapapam/
    Application: lyapapam; version: 1.
    Server: appengine.google.com.
    Scanning files on local disk.
    Scanned 500 files.
    Scanned 1000 files.
    Initiating update.
    Email: <my_email_was_here>@gmail.com
    Password for <my_email_was_here>@gmail.com:
    Error 401: --- begin server output ---
    Must authenticate first.
    --- end server output ---
    We are getting this message with any application and under any developer account available to us. Here's what we have installed:
    App Engine SDK - 1.3.2
    PIL - 1.1.7
    Python - 2.5.5
    pip - 0.6.3
    ssl - 1.15
    wsgiref - 0.1.2
    So, what can it be? Is it a well-known problem?

    Read the article

  • Mass remove passwords from rar archives

    - by ldigas
    Is there a way to mass-remove passwords from a bunch of files? (I'm using the WinRAR demo, but I'm willing to change to whatever is needed.) Problem description: for reasons unknown to me, some archiving was done for two-and-something years in RAR format, and all the archives have passwords. I have a list of them, all similar (mostly something like John-03, John-04, John-05 ... i.e. name-month ...), but I need to manipulate the files in bulk, and removing the passwords and/or extracting all those files while entering passwords manually is a real problem. What would be my best options here? Ideally, I'm looking for some kind of archiver which tries out a predefined list of passwords and asks only if none of them cracks the safe. AFAIK, WinRAR has no such feature.
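
    A minimal sketch of the "try a predefined password list" idea, assuming the unrar command-line tool is installed (its t/x commands and -p<password> switch); the directory and password pattern below are placeholders:

        import subprocess
        from pathlib import Path

        archive_dir = Path("C:/archives")                      # hypothetical location
        passwords = [f"John-{m:02d}" for m in range(1, 13)]    # hypothetical list

        for archive in archive_dir.glob("*.rar"):
            for pw in passwords:
                # "unrar t" tests the archive; exit code 0 means the password fits.
                test = subprocess.run(["unrar", "t", f"-p{pw}", str(archive)],
                                      capture_output=True)
                if test.returncode == 0:
                    dest = archive.with_suffix("")             # extract next to the archive
                    dest.mkdir(exist_ok=True)
                    subprocess.run(["unrar", "x", f"-p{pw}", str(archive), f"{dest}/"])
                    break
            else:
                print(f"No password on the list worked for {archive.name}")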

    Read the article

  • Linux file permissions not being preserved

    - by yellavon
    I am deploying some custom software as root (a necessity for this situation). I set the owner/group to user1:user1 and set all the files to 644 beforehand in shell, then copy and deploy with ant. However, when files get copied over from the deployment directory, the ownership changes back to root and all the files install with 666 permissions. This seems to occur whether the file is overwritten or newly created. I believe there is a way to set an option in cp, mv commands to preserve permissions, but that would be a lot of commands to change. How can I fix this? Is there some setting I can change temporarily for root so the install always preserves the file permissions?
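
    As a point of reference, the "preserve" option mentioned for cp boils down to copying the data and then putting the metadata back; a minimal Python sketch with hypothetical paths (restoring ownership requires running as root):

        import os
        import shutil

        src = "/home/user1/deploy/app.conf"    # hypothetical source
        dst = "/srv/app/app.conf"              # hypothetical destination

        st = os.stat(src)
        shutil.copyfile(src, dst)              # copies file contents only
        shutil.copystat(src, dst)              # restores mode bits and timestamps
        os.chown(dst, st.st_uid, st.st_gid)    # restores owner and group (root only)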

    Read the article

  • Backup Source (non source control)?

    - by acidzombie24
    I back up my code with svn. I have project files in there, however I ignore selected things. I also ignore jpg, ogg, etc. Right now I would like to back up everything. However, the zip result is 1 GB (I have a lot of code), and I know I can cut the file size down by 60%+. Is there an app I can use which will back up everything except the bin and obj folders? Perhaps one that keeps ogg, json and jpg files but ignores .svn or .pdb files?
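
    A minimal sketch of that kind of selective backup in Python (the source tree, output path and exclusion rules are placeholders to adjust):

        import os
        import zipfile

        SRC = r"C:\projects"               # hypothetical source tree
        OUT = r"C:\backup\code.zip"        # hypothetical archive path
        SKIP_DIRS = {"bin", "obj", ".svn"}
        SKIP_EXTS = {".pdb"}

        with zipfile.ZipFile(OUT, "w", zipfile.ZIP_DEFLATED) as zf:
            for root, dirs, files in os.walk(SRC):
                # Prune unwanted directories in place so os.walk never descends into them.
                dirs[:] = [d for d in dirs if d.lower() not in SKIP_DIRS]
                for name in files:
                    if os.path.splitext(name)[1].lower() in SKIP_EXTS:
                        continue
                    full = os.path.join(root, name)
                    zf.write(full, os.path.relpath(full, SRC))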

    Read the article

  • Windows Server Backup 2008 R2 - what is generating all the change data?

    - by bobjandal
    We have a small, relatively idle Windows Server 2008 R2 installation that does basic file sharing and Exchange for about 10 not very active users. When running a Windows Server Backup, the daily incremental data is about 20 GB. This is not coming from users' shared files, nor from changes in their mailbox sizes. The total size of the installation is 249 GB, which is mostly old files. Where is all this data coming from, and how can I reduce it? Online backup of the VHD file produced by the backup is taking a while because of this daily change. Is there some way I can at least see which files are changing and contributing to this data? Options I can think of but am not sure about:
    1) Pagefile churning - although the backup does not include the pagefile, perhaps the changed blocks left behind are included?
    2) Logs or something? But the installation size stays the same every day.
    3) Should I zero free space using sdelete before backing up, perhaps?

    Read the article

  • Can I list file names (or their parent directories) that were recently deleted using rm in OS X?

    - by Andrew Grimm
    Is it possible to find out which files and directories have recently been deleted by rm in OS X? Or, failing that, is it possible to find which parent directories have had files or directories within them deleted? The OS version is Snow Leopard. Background: last night, rvm (ruby version manager) ran rm -rf on the ~/ruby directory in my home directory. (This bug has since been fixed.) Ideally, I'd like to know what files within the ~/ruby directory were deleted, but failing that, I'd like to know if rvm deleted anything outside of ~/ruby. In case anyone's wondering about backups: just about everything within ~/ruby is a git project that has a remote repo, and I have a fairly recent Time Machine backup (only 20 days old).
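
    There is generally no log of rm deletions, but given a Time Machine backup the live tree can be diffed against it; a rough Python sketch, assuming the relevant backup snapshot is mounted at the hypothetical path below:

        import os

        BACKUP = "/Volumes/Backup/Backups.backupdb/MyMac/Latest/Macintosh HD/Users/me"  # hypothetical
        LIVE = "/Users/me"

        def relative_files(base):
            found = set()
            for root, dirs, files in os.walk(base):
                for name in files:
                    found.add(os.path.relpath(os.path.join(root, name), base))
            return found

        # Files present in the 20-day-old backup but missing from the live tree.
        # (This also lists anything legitimately deleted since the backup was taken.)
        for path in sorted(relative_files(BACKUP) - relative_files(LIVE)):
            print(path)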

    Read the article

  • Is there a free FTP client that has macros?

    - by wheresrhys
    At the moment I'm using FileZilla to deploy new versions of a site to the live server. The trouble is that there are one or two config, bootstrap, etc. files which are different on the live site and which I have to be careful not to overwrite. Also, there are big areas of code that never change (e.g. I use the Zend Framework, which is always the same). I'd like to be able to record a macro to upload the same bunch of files and folders every time, excluding the subdirectories and files which shouldn't be overwritten. Does any FTP client offer this?
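
    If no client turns up, the "macro" can also be a short script; a sketch using Python's standard ftplib, where the host, credentials, exclusion list and skipped trees are all placeholders (it also assumes the remote directory layout already exists):

        import os
        from ftplib import FTP

        LOCAL_ROOT = "site"                          # hypothetical local build directory
        EXCLUDE = {"config.php", "bootstrap.php"}    # never overwrite these
        SKIP_DIRS = {"library/Zend"}                 # unchanged code, no need to re-upload

        ftp = FTP("ftp.example.com")                 # placeholder host
        ftp.login("user", "password")                # placeholder credentials

        for root, dirs, files in os.walk(LOCAL_ROOT):
            rel = os.path.relpath(root, LOCAL_ROOT).replace(os.sep, "/")
            if rel in SKIP_DIRS:
                dirs[:] = []                         # do not descend into this tree
                continue
            for name in files:
                if name in EXCLUDE:
                    continue
                remote = name if rel == "." else f"{rel}/{name}"
                with open(os.path.join(root, name), "rb") as fh:
                    ftp.storbinary(f"STOR {remote}", fh)

        ftp.quit()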

    Read the article

  • Virtual PC lost parent disk for differential vhd

    - by SeeR
    Two years ago I had the brilliant idea of creating a base Windows XP disk which all of my XP VMs would use. Of course it ended up that I had only one VM with XP :-). Today I needed to free up some space on my HDD, so I found an unused VM named "Windows XP" which was only 5 GB. I deleted it as fast as possible :-) and of course I used Shift so it didn't go to the Recycle Bin. Now when I want to run my XP VM I get the following error: "One of the parent hard disks of ... is missing." It's not a problem for me as long as I can restore files from the differential VHD that I still have. So: I have the differential disk with the files I need, and I don't have the parent disk. My question is: how can I restore files from this differential VHD?

    Read the article

  • Limiting an SSH user account to access only his home directory!

    - by EBAGHAKI
    By reading some tutorials online I used these commands:
    - Make a local group: net localgroup CopsshUsers /ADD
    - Deny access to this group at the top level: cacls c:\ /c /e /t /d CopsshUsers
    - Open access to the copSSH installation directory: cacls copssh-inst-dir /c /e /t /r CopsshUsers
    - Add the copSSH user to the group above: net localgroup CopsshUsers mysshuser /add
    Simply put, these commands should create a user group that has no permissions on the computer and only has access to the copSSH installation directory. That isn't what happens, since you cannot change the permissions on your Windows directory: the third command won't remove access to the Windows folder (it reports access denied in its log). I worked around that by taking ownership of the Windows folder and then executing the third command again, so CopsshUsers now has no permissions on the Windows folder. Now I try to SSH to the server and it simply can't log in! This is kind of funny, because with permission on the Windows directory you can log in and without it you can't!! So if you CAN SSH to the server, you apparently know you have access to the Windows directory! (Is this really true??) Simple task: limiting an SSH user account to access only his home directory on WINDOWS and nothing else! Guys, please help!

    Read the article

  • Not able to delete file from server with permissions of 644 via PHP script

    - by letseatfood
    I am trying to delete JPEG files that were uploaded to the server via FTP. The files are uploaded and written with permissions of 644. The owner and group of the upload directory are mike and mike. I have tried changing the owner and group to www-data, but that does not seem to work. I am trying to delete the files with a PHP script using unlink(). This works on the production server (which is a hosting service), but not my development server, which is a LAMP setup. This leads me to believe it has something to do with permissions on my development server.
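
    For what it's worth, on Linux deleting a file is governed by write permission on the containing directory, not by the file's own mode; a small Python demonstration (run as a regular user, with a throwaway temp directory):

        import os
        import tempfile

        d = tempfile.mkdtemp()
        path = os.path.join(d, "photo.jpg")
        open(path, "w").close()
        os.chmod(path, 0o644)        # same mode as the uploaded JPEGs

        os.chmod(d, 0o555)           # make the directory non-writable
        try:
            os.unlink(path)          # unlink needs write permission on the directory
        except PermissionError as exc:
            print("unlink refused:", exc)
        finally:
            os.chmod(d, 0o755)       # clean up
            if os.path.exists(path):
                os.unlink(path)
            os.rmdir(d)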

    Read the article

  • Cannot install netbeans on Gentoo

    - by siebz0r
    I'm trying to install NetBeans on my Gentoo system and I just cannot get it to work. When I compile it, it fails because it cannot locate org.apache.maven.project.MavenProject:
    [nb-javac] class file for org.apache.maven.project.MavenProject not found
    I suspect it has something to do with the installation not being able to find several files, although those are only reported as warnings. The missing files have a version number in their filenames, and the files that are on disk have version numbers that are slightly off, which explains why they cannot be found. The full output of the install can be found here: http://pastebin.com/43NS2ktz

    Read the article

  • How to synchronize two folders on two remote Linux virtual machines

    - by Manoj Agarwal
    I have two virtual machines; the host OS is ESXi 3.5 and the guest OS is CentOS 4.6. There are two remotely located ESXi servers, each containing a CentOS 4.6 virtual machine. I want any change I make to any file/folder in one virtual machine to be automatically synchronized to the other, remote virtual machine. The synchronization process should be automatic. It should only sync the differences, not simulate a full copy with an overwrite operation. The sync should be intelligent enough to work out what has changed and what has not, and should only update the changed files/folders. Further, there should be some sort of overview and selection for syncing; for example, if it shows that 4 files have changed, it should be possible to sync only two of them and leave the other two for the time being. So, some intelligent syncing mechanism for Linux is needed.
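
    A common building block for this is rsync over SSH, which transfers only differences; a minimal Python wrapper sketch (paths and host name are placeholders) that previews the changes with --dry-run before applying them:

        import subprocess

        SRC = "/data/shared/"                   # trailing slash: sync the contents, not the folder itself
        DEST = "root@remote-vm:/data/shared/"   # placeholder remote host and path

        common = ["rsync", "-az", "--delete", "-e", "ssh"]

        # Preview: --dry-run lists what would be transferred or removed.
        subprocess.run(common + ["--dry-run", "-v", SRC, DEST], check=True)

        if input("Apply these changes? [y/N] ").lower() == "y":
            # rsync's delta algorithm sends only the changed parts of changed files.
            subprocess.run(common + ["-v", SRC, DEST], check=True)

    Per-file selection would need rsync --include/--exclude filters or a two-way tool such as Unison; for fully automatic, continuous syncing, the script could run from cron or be replaced by something like lsyncd.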

    Read the article

  • install grub on disk image

    - by Dima
    I have a disk image with 2 partitions:
    Partition 1 has a cramfs file system (read-only). This partition contains all the system files of the OS.
    Partition 2 has an ext3 file system. This partition holds only configuration files that may be changed.
    How can I install the GRUB1 boot loader in the MBR of the image? I tried copying the first 446 bytes of my hard disk and copying the GRUB files to the /boot directory on the 1st (cramfs) partition. I cannot use grub-install because I have a disk image and not the disk itself. Any ideas?

    Read the article

  • Norton Backup "Failed to Restore"

    - by Teknophilia
    I recently had one of my computers (XP) die on me. I had its files set to automatically back up to another PC's HD using Norton. I've tried using Norton restore on the second computer to restore some files (Word documents, pictures), but when I do this, I get a dialog box saying that it "Failed to Restore". When I click to continue, it shows a list of the files I tried to restore, along with a status indicator for each file (which says "invalid file"). Any ideas?

    Read the article

  • Can't get powershell to return where results from GCI using ACL

    - by Rossaluss
    I'm trying to get PowerShell to list files in a directory that are older than a certain date and owned by a certain user. I've got the below script so far, which gives me all the files older than a certain date and lists the directory and who owns them:
    $date=get-date
    $age=$date.AddDays(-30)
    ls '\\server\share\folder' -File -Recurse | `
    where {$_.lastwritetime -lt "$age"} | `
    select-object $_.fullname,{(Get-ACL $_.FullName).Owner} | `
    ft -AutoSize
    However, when I try to use an additional where condition to select only files owned by a certain user, I get no results at all, even though I know I should, based on the match I'm trying to obtain (as below):
    $date=get-date
    $age=$date.AddDays(-30)
    ls '\\server\share\folder' -File -Recurse | `
    where ({$_.lastwritetime -lt "$age"} -and {{(get-acl $_.FullName).owner} -eq "domain\user"}) | `
    select-object $_.fullname,{(Get-ACL $_.FullName).Owner} | `
    ft -AutoSize
    Am I missing something? Can I not use the get-acl command in a where condition as I've tried to? Any help would be appreciated. Thanks

    Read the article

  • Handling UTF-8 with BOM in HTTP

    - by Alois Mahdal
    Say I have a script which at some point serves a plain text file as the response body (right after "\n\n"). These files are provided by users, but I can expect them to be UTF-8, so I hard-wire Content-Type: text/plain; charset=UTF-8. But while I can teach users to save everything in UTF-8, I can't be very sure that the files will be without a BOM ("\xEF\xBB\xBF"), as at least on Windows this is not very clearly distinguished in common plain text editors, and not every one of them uses the same default. So what about files created on Windows, which may or may not start with a BOM? Should/will the server or UA get rid of this debris for me? Or is it my task to prepare clean UTF-8, i.e. open each file and check whether a BOM needs to be removed?
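
    The server will pass the bytes through untouched, and whether a UA hides the BOM varies, so if clean UTF-8 matters it is safest to strip it in the script; a minimal sketch in Python (the question doesn't name the script's language), using the "utf-8-sig" codec, which decodes UTF-8 and silently drops a leading BOM if one is present:

        def read_user_text(path):
            # Equivalent to checking for b"\xef\xbb\xbf" by hand and slicing it off.
            with open(path, encoding="utf-8-sig") as fh:
                return fh.read()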

    Read the article

  • How can I create multiple identical AWS EC2 server instances with large amounts of persistent data?

    - by mojones
    I have a CPU-intensive data-processing application that I want to run across many (~100,000) input files. The application needs a large (~20 GB) data file in order to run. What I would like to do is:
    - create an EC2 machine image that has my application and associated data files installed
    - boot up a large number (e.g. 100) of instances of this image
    - split my input files up into 100 batches and send one batch to be processed on each instance
    I am having trouble figuring out the best way to ensure that each instance has access to the large data file. The data file is too big to fit on the root filesystem of an AMI. I could use Block Storage, but a given Block Storage volume can only be attached to a single instance, so I would need 100 clones. Is there some way to create a custom image that has more space on the root filesystem so that I can include my large data file? Or is there a better way to tackle this problem?
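
    One common pattern is to keep the 20 GB file in S3 and have each instance pull its own copy at boot (downloads from different instances are independent, so 100 of them can fetch the same object concurrently); a sketch using boto3, where the bucket, key and local path are placeholders:

        import boto3

        BUCKET = "my-dataset-bucket"        # placeholder bucket name
        KEY = "reference/data-20gb.bin"     # placeholder object key
        LOCAL = "/mnt/data/data-20gb.bin"   # placeholder path on attached/instance storage

        # Typically run from the instance's user-data / startup script.
        s3 = boto3.client("s3")
        s3.download_file(BUCKET, KEY, LOCAL)

    Another route that avoids the one-volume-per-instance limit is to snapshot an EBS volume containing the data file and create a fresh volume from that snapshot for each instance.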

    Read the article

  • apache php access rights configuration

    - by AndreasT
    Hi, I am a complete Apache (and co.) newbie. Currently it serves only the default page, and on the default page the user cannot list the directory or files. But when I create a directory, say /var/www/foobar, and place files in it, the user can see the contents of the directory by going to www.mydomain.org/foobar. I run pretty much the default configuration: on Directory "/" I have FollowSymlinks and AllowOverride None, and on what DocumentRoot points to I have Indexes FollowSymlinks MultiViews and "allow from all" set. My questions are:
    Can I stop people from listing subdirectories?
    Can people, if I do not change the configuration, somehow read the PHP files in there? (I mean not the rendered page, but the .php source.)
    Pointers to good resources about this would also be nice. Thanks in advance.

    Read the article
