Search Results

Search found 50510 results on 2021 pages for 'static files'.


  • Can't figure out why hard drive is full [closed]

    - by Belgin Fish
    Possible Duplicate: How do I find out what is using up all the space on my / partition? No free disk space.
    I have two hard drives in my server: a main one that is 10 GB and a separate one that is 2 TB. I'm storing all the files on the second one, and the df -h output looks like this:

        Filesystem    Size  Used  Avail  Use%  Mounted on
        /dev/sda2     9.2G  8.8G     0   100%  /
        tmpfs         1.5G     0  1.5G     0%  /lib/init/rw
        udev          1.5G  148K  1.5G     1%  /dev
        tmpfs         1.5G     0  1.5G     0%  /dev/shm
        /dev/sda4     1.8T  747G  981G    44%  /home
        /dev/sda4     1.8T  747G  981G    44%  /usr/lib/cgi-bin

    I just can't figure out why the first one is full when all the files are being stored in /usr/lib/cgi-bin. I'm running Debian, and I can't seem to find any files that would take up 8.8 GB that aren't on the second hard drive. Thanks!
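
    A quick way to narrow this down (a minimal sketch, assuming root access; the -x flag keeps du on the / filesystem so the mounted 2 TB drive is not counted):

        # usage in MB per top-level directory, staying on the root filesystem only
        du -xm --max-depth=1 / | sort -n

        # space can also be held by deleted files that some process still has open
        lsof +L1

    Common culprits on a small root partition are /var/log, /var/cache and leftovers under /tmp.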

    Read the article

  • Address bar in Finder?

    - by wag2639
    I'm used to knowing where all my files are (and I'm anal about it -- I don't need Mr Jobs thinking he knows best about where my files should go). Is there a way to get an address bar to show up in Finder in OS X (10.5+), like in Explorer on Windows or Nautilus in GNOME? Edit: I also want to be able to copy the address bar. Perhaps the workflow is different on a Mac, but I'm used to thoroughly sorting my files under many layers of folders, and when I need to upload or download something, or access a file on the command line, I can copy and paste the path directly into the file dialog. To clarify, my goal is to have an experience like in Windows: press Ctrl + D (Cmd + L) and Ctrl + C.
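
    Two things that come close (the defaults tweak is a commonly cited setting rather than an official Finder feature, so treat it as an assumption to verify on your version):

        # Cmd+Shift+G opens Finder's "Go to Folder" sheet, which accepts a typed or pasted path.

        # Show the full POSIX path of the current folder in the Finder title bar:
        defaults write com.apple.finder _FXShowPosixPathInTitle -bool YES
        killall Finder

    The title-bar path can then be read at a glance; copying it still takes an extra step (for example, dragging the folder's proxy icon into a Terminal window prints the path).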

    Read the article

  • Overcrowded Windows XP Folders

    - by BlairHippo
    I know that, technically, an individual Windows XP directory can hold an immense number of files (over 4.29 billion, according to a quick Google search). However, is there a practical ceiling where too many files in one directory start having an impact on reads of those files? If so, what factors would exacerbate or help the issue? I ask because my employer has several hundred XP machines in the field at client sites, and the performance on some of the older ones is getting "sludgy." The machines download and display client-defined images, and my supervisor and I suspect that our slacktastic approach to cache management could be to blame. (Some of the directories have tens of thousands of images in them.) I'm trying to gather evidence to support or contest the theory before spending time on a coding fix.

    Read the article

  • "I/O Error Occurred" in vSphere Client working with ESXi

    - by Chris
    I have a datastore set up in ESXi where I put all my ISOs. Somehow, something broke (I don't know what) and now I can't upload files to that (or any other) datastore. For large, ISO-sized files, the "Uploading..." dialog pops up, hangs for a while, and then the "I/O Error Occurred" displays. For smaller files (10 meg neighborhood), the "Uploading..." dialog comes up, a progress bar starts going, and it estimates a time remaining. Then it hangs at 1 second remaining for a while, and the same "I/O Error Occurred" comes up. Has anyone seen a problem like this?

    Read the article

  • I started getting a weird message "Encrypting file system - Back up your file encryption key"

    - by Ove
    I started getting a strange message when I start my computer. An icon appears in the system tray, and a popup tells me "Encrypting file system - Back up your file encryption key". I know what EFS is, but I don't use it. To my knowledge, I don't have any encrypted files on my partition. I have searched using Total Commander on all the partitions for files that have the "encrypted" attribute, but I found nothing. So I don't have any encrypted files. Does anyone know what I did to get this message?

    Read the article

  • How to do simple multitasked loop processing over filenames with PowerShell?

    - by Ville Koskinen
    I'm batch transcoding some 50 GB of video files on a USB hard disk which is connected to a WLAN router. The drive is mapped as a network drive on my Windows 7 laptop. The speed handicap of the WLAN makes some parts of the processing unnecessarily slow, so I would like to do the following with PowerShell:

    1. List the names of the files on the network drive to be transcoded
    2. Copy the first file to a temporary folder on my laptop
    3. Simultaneously:
       - transcode the file in the folder
       - begin copying the next file from the network drive to the temporary folder
    4. After the transcode and the copy have both ended:
       - delete the file which has been transcoded from the temporary folder
       - begin transcoding the next file in the temporary folder
    5. Loop until all files have been processed

    How would I be able to do this with PowerShell? The multitasking part is an obstacle for my skill/persistence combination.
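
    The shape of the loop is easier to see written out as plain shell first (a rough sketch only: transcode.sh and the paths are placeholders, and PowerShell's Start-Job / Wait-Job cmdlets provide the equivalent background-job mechanism):

        src=/mnt/networkdrive/videos          # placeholder: the mapped network drive
        tmp=/tmp/transcode                    # placeholder: local temporary folder

        prev=""
        for f in "$src"/*.mp4; do
            cp "$f" "$tmp/" &                 # start copying the next file in the background
            copy_pid=$!
            if [ -n "$prev" ]; then
                ./transcode.sh "$prev"        # transcode the previously copied file meanwhile
                rm "$prev"
            fi
            wait "$copy_pid"                  # make sure the copy finished before the next round
            prev="$tmp/$(basename "$f")"
        done
        [ -n "$prev" ] && ./transcode.sh "$prev" && rm "$prev"    # last file

    In PowerShell terms, the backgrounded cp becomes Start-Job { Copy-Item ... } and the wait becomes Wait-Job on the returned job object.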

    Read the article

  • Debian: Should I add vlan interface into bridge for KVM?

    - by javano
    I am setting up a Debian Squeeze box as a KVM host. I want to add multiple interfaces to each KVM guest, so I want them to be on different VLANs. After reading about this, I believe the best method is to add multiple logical VLAN (sub-)interfaces to the physical NICs, create a bridge adapter for each VLAN interface, and assign each bridge as a NIC for the KVM guests. Does this make good sense, or madness? Do I have to use bridged interfaces with KVM like this? Can't I just add eth1.xx and eth1.yy to my interfaces config below and then configure those directly as bridged KVM guest NICs? If so, how should this look in the interfaces config file below?

        user@host:~$ cat /etc/network/interfaces
        # This file describes the network interfaces available on your system
        # and how to activate them. For more information, see interfaces(5).

        # The loopback network interface
        auto lo
        iface lo inet loopback

        # Management Interface
        auto eth0
        iface eth0 inet static
            address 172.22.0.31
            netmask 255.255.255.0
            gateway 172.22.0.1

        # Interface for guest VMs
        auto eth1

        # Guest1 : Use VLAN 117
        auto eth1.117
        iface eth1.117 inet manual

        # Set up br1 for guest 1, bridging with vlan 117
        auto br1.117
        iface br1.117 inet manual
            bridge_ports eth1.117
            bridge_stp off

        user@host:~$ uname -a
        Linux hostname 3.4.9 #1 SMP Wed Aug 22 19:08:46 BST 2012 x86_64 GNU/Linux

    UPDATE: I would really like it if someone could clarify the config for me, as I have also seen the above configured with the syntax below, and I don't see why one would be preferred over the other:

        # Interface for guest VMs
        auto eth1
        allow-hotplug eth1
        iface eth1 inet static

        # Vlan 117 for guest 1
        auto vlan 117
        iface vlan111 inet static
            vlan_raw_device eth1

        # Guest 1 : NIC 1
        auto br1.117
        iface br1.117 inet manual
            bridge_ports vlan117
            bridge_stp off

    Read the article

  • You may be attempting to write into another users folder which is not supported...

    - by Triynko
    Why do I get this message when trying to compile a file in Flash MX 2004? "If you are logged in with a limited user account, then you may be attempting to write into another users folder which is not supported." The folder and files are owned by the Administrators group; however, the user trying to modify the files is a member of a group which is inheriting permissions to have full control over the files, except take ownership and change permissions. Why would a user, whose effective permissions include virtually everything, be unable to write to a file... and end up getting an error message like the one described?

    Read the article

  • Windows command line built-in compression/decompression tool?

    - by Will Marcouiller
    I need to write a batch file to unzip files to their current folder from a given root folder.

        Folder 0
        |----- Folder 1
        |      |----- File1.zip
        |      |----- File2.zip
        |      |----- File3.zip
        |
        |----- Folder 2
        |      |----- File4.zip
        |
        |----- Folder 3
               |----- File5.zip
               |----- FileN.zip

    So, I wish my batch file to be launched like so:

        ocd.bat /d="Folder 0"

    Then, have it iterate from within the batch file through all of the subfolders and unzip the files exactly where the .zip files are located. So here's my question: does Windows (from XP at least) have a command-line interface for its embedded zip tool? Otherwise, shall I stick to a third-party util?

    Read the article

  • How to use BT or emule across 2 or more hard drives?

    - by the searcher
    One difficulty with BT or eMule is that, when the hard drive is full, we constantly need to move older files to a new hard drive so that we can download newer files. We can change BT or eMule's settings so that the download folder points to the new hard drive, but then what if eMule hasn't finished downloading some files that are hard to find and is 92% done... in that case we would like to keep the old setting so that when the last 8% arrives, it can go into the correct file. (The same goes for BT, if we haven't finished some file or if we want to seed something later.) So is there a good way to let BT or eMule point to 2 hard drives, or somehow let the new hard drive "merge" into the existing hard drive / folder?

    Read the article

  • Blocking a specific URL by IP (a URL create by mod-rewrite)

    - by Alex
    We need to block a specific URL for anyone not on a local IP (anyone without a 192.168.x.x address). We cannot, however, use Apache's

        <Directory /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Directory>

        <Files /var/www/foo/bar>
            Order allow,deny
            Allow from 192.168
        </Files>

    because these block specific files or directories; we need to block a specific URL which is created by mod_rewrite, where the page is dynamically generated by PHP. Any ideas would be greatly appreciated.
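
    Since <Directory> and <Files> match the filesystem, the URL space is normally handled with <Location>, which matches the request URI rather than a path on disk. A sketch in Apache 2.2 syntax, with /members/secret-report standing in for the rewritten URL:

        <Location /members/secret-report>
            Order deny,allow
            Deny from all
            Allow from 192.168
        </Location>

    A mod_rewrite alternative would be a RewriteCond on %{REMOTE_ADDR} in front of the existing rule, but the <Location> block is the smaller change.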

    Read the article

  • Apache MaxClients doubt

    - by Milan Babuškov
    I have a busy Apache server serving only dynamic PHP+MySQL pages. It is a prefork Apache, version 2.2.4, with the following config:

        KeepAlive off
        StartServers 8
        MinSpareServers 32
        MaxSpareServers 64
        ServerLimit 512
        MaxClients 512
        MaxRequestsPerChild 4000

    MaxClients/ServerLimit used to be set to 256, but I got the following error in error_log, so I increased it:

        [error] server reached MaxClients setting, consider raising the MaxClients setting

    It seems to work now, but I have a doubt. Looking at the MySQL query log, I see a couple hundred clients per second, but "ps ax" only shows 8, 9 or 10 processes running:

        [root@engine ~]# ps ax | grep http | wc -l
        10

    I even got this many processes when the above error message was shown in error_log. This made me investigate further. When I run netstat -a, I get something like this:

        tcp    0   0 engine:http adsl-105-143.teol.net:21453   TIME_WAIT
        tcp    0   0 engine:http 118-36.static.kds:mck-ivpip   TIME_WAIT
        tcp    0   0 engine:http 118-36.static:oce-snmp-trap   TIME_WAIT
        tcp    0   0 engine:http 118-36.static.kd:unifyadmin   TIME_WAIT
        tcp    0   0 engine:http cable-188-2-25-29.dyna:4906   TIME_WAIT
        tcp    0   0 engine:http adsl-105-143.teol.net:21458   TIME_WAIT
        tcp    0   0 engine:http 109-92-83-91.dynamic.:62821   TIME_WAIT
        tcp    0   0 engine:http cable-89-216-142-192.:63576   TIME_WAIT
        tcp    0   0 engine:http 109-92-83-91.dynamic.:62819   TIME_WAIT
        tcp 1081   0 engine:http pttnetadsl38-36.ptt.r:50496   ESTABLISHED
        tcp    0   0 engine:http cable-188-2-36-196.dyn:4136   TIME_WAIT
        tcp    0   0 engine:http cable-89-216-142-192.:63580   TIME_WAIT
        tcp    0   0 engine:http cable-89-216-142-192.:63581   TIME_WAIT
        etc.

    When counting those, I get:

        [root@engine ~]# netstat -a | grep http | wc -l
        431

    Can anyone tell me what is really going on here and how to make sure my server keeps working, given that I'm only using 50% of the available RAM in the machine?
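
    One detail worth noting when reading those numbers: TIME_WAIT sockets are kept around by the kernel for a while after the response has been sent and no longer occupy an Apache child, so counting every netstat line overstates concurrency. A sketch of a more telling count (numeric output, port 80):

        # connections an Apache child is actually serving right now
        netstat -ant | grep ':80 ' | grep -c ESTABLISHED

        # sockets merely waiting to be reaped by the kernel
        netstat -ant | grep ':80 ' | grep -c TIME_WAIT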

    Read the article

  • How to change drag & drop behaviour in Windows 7's explorer?

    - by Pekka
    I have a new touch screen, and am playing around with its functionality. The most productive use for me is organizing files (literally) by hand. It's fun working through a list of files, dragging and dropping them to the right locations using your index finger. It feels better on the wrist than mouse-clicking, too. The only problem is that when I drag & drop files across drives in Windows 7, the default behaviour is to copy the file instead of moving it. I know I can influence this using right click, but that is of course no option in my situation. How can I change the default drag & drop behaviour in Windows 7's explorer?

    Read the article

  • SQL Maintenance Cleanup Task Working but Not Deleting

    - by Alex
    I have a Maintenance Plan that is supposed to go through the BACKUP folder and remove all .bak files older than 5 days. When I run the job, it gives me a success message, but older .bak files are still present. I've tried the step at the following question: SQL Maintenance Cleanup Task 'Success' But not deleting files. The result is column IsDamaged = 0. I've verified against the following question and this is not my issue: Maintenance Cleanup Task(s) running 'successfully' but not deleting back up files. I've also tried deleting the Job and Maintenance Plan and recreating them, but to no avail. Any ideas?

    Read the article

  • Apache2 enable .ini mod in /etc/php5/mods-available

    - by GuiTeK
    One can use the a2enmod [module] command to enable mods located in /etc/apache2/mods-available. But what about mods in /etc/php5/mods-available? When I try to enable a mod in this directory (e.g. xdebug), I get the following error:

        ERROR: Module xdebug does not exist!

    Yet /etc/php5/mods-available/xdebug.ini exists. I understand that a2enmod may work only with *.load files (which makes sense, since *.ini files are just configuration files), but then what's the correct way of enabling modules located in /etc/php5/mods-available?
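
    a2enmod only knows about Apache modules under /etc/apache2/mods-available. For the PHP side, newer Debian/Ubuntu packaging ships a parallel helper, php5enmod; where it is absent, the traditional approach is a symlink into the conf.d directory PHP actually scans. A sketch (the conf.d location and the 20- priority prefix vary by release, so treat the paths as assumptions to check):

        # if the helper exists (later php5 packaging)
        sudo php5enmod xdebug

        # otherwise, link the ini file into PHP's conf.d directory
        sudo ln -s /etc/php5/mods-available/xdebug.ini /etc/php5/conf.d/20-xdebug.ini

        # either way, reload Apache so mod_php picks it up
        sudo service apache2 restart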

    Read the article

  • IIS 6.0 FTP Folder Permissions

    - by Beuy
    I have an IIS FTP website set up like this: \ftp\users\domain\public\public. Software that runs on clients' computers logs into the FTP server by specifying domain\public and moving to public; it then uploads or downloads files and folders in that area. I want to restrict permissions on \ftp\users\domain\public so that nothing and nobody can write files or folders there, only to \ftp\users\domain\public\public. I set the NTFS permissions on the folder so that domain\users, public and server\users no longer have the Modify right, yet I can still upload and modify files. I have disabled inheritance from the parent folder of \ftp\users\domain\public as well. Any ideas on what I'm missing here? P.S. I know this is a stupid setup and makes no sense; it's some bizarre legacy application that I need to migrate to a safer environment until it can be replaced.

    Read the article

  • Git pull with unstaged changes

    - by Peter
    Attempting a git pull when you have unstaged changes will fail, saying you can commit or stash them. I suppose a workaround is to git stash, git pull, then git stash pop. However, is there an alternative way to do this? I would like to forcefully git pull even if there are unstaged changes, but only if the files being brought down do not overwrite the modified files. That is, if I have a repo with the files "derp1", "derp2", "derp3" and modify "derp1" locally, a git pull should bring down and overwrite everything except the "derp1" file. I assume a git stash + pull + stash pop achieves this already? And is there a better way? I suppose this could also work differently if it occurs in a submodule.
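
    For reference, the two variants as commands (a sketch; --autostash needs a reasonably recent Git, and stash pop will still stop on a conflict if the pull touched the same file you modified):

        # explicit: park local edits, pull, then reapply them
        git stash && git pull && git stash pop

        # newer Git: have pull stash and reapply around a rebase automatically
        git pull --rebase --autostash
        # or enable it once and for all
        git config --global rebase.autoStash true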

    Read the article

  • Apache only transferring partial content from a Samba share

    - by thaBadDawg
    I have an Apache server running on CentOS 5.3. It currently hosts 12 sites with no known issues. (I say this to point out that up to this point my Apache installation has performed flawlessly) I'm adding a new site where the DocumentRoot of the new VirtualHost is a Samba share. When at the command line of the server I can cp video.m4v ~ and the whole file is copied properly to my home directory. But when I try to access the file from IE/Firefox/Safari/Chrome it only passes back a partial result of 33k. The same thing is happening with my image and audio files. If I make the files local to the server by copying them from the share and then serving them up then the files transfer. Any ideas?
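
    One configuration detail worth ruling out (an assumption based on the symptoms, not a confirmed diagnosis): Apache's sendfile() and mmap() optimisations are known to truncate or corrupt responses when the DocumentRoot lives on a network filesystem such as a CIFS/Samba mount, and turning them off for that vhost is the usual workaround. A sketch, with the ServerName and path as placeholders:

        <VirtualHost *:80>
            ServerName media.example.com          # placeholder for the new site
            DocumentRoot /mnt/samba/media         # placeholder for the mounted share
            EnableSendfile off                    # do not hand files to sendfile() on a network FS
            EnableMMAP off                        # do not mmap() files from the share
        </VirtualHost>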

    Read the article

  • Extract and install Word 2003 standalone without full CD

    - by pcampbell
    Given a proper Office 2003 CD, is it possible to extract just the files that are needed for one application... i.e. Word or Excel? Browsing the CD, you can see WORD11.MSI. The goal here is to extract just the necessary bits to install the one app. Disk space isn't the concern, but rather the larger question of 'is it possible' and how? Is it possible to copy those files from the CD to another location to allow the installation of just one application? What files would be required from the CD to accomplish this?

    Read the article

  • IIS: redirect everything to another URL, except for one Directory

    - by DrStalker
    I have an IIS server (IIS 6, Win 2003) that hosts the site http://www.foo.com. I want any request to http://foo.com (no matter what path/filename is used) to redirect to http://www.bar.org/AwesomePage.html UNLESS the request is for http://www.foo.com/specialdir, in which case the HTML files in the local directory specialdir should be used. The problem I have is that once the redirect is set, it also affects /specialdir - even if I right-click on that directory and select "content should come from ... local directory", the change does not take effect, and the directory still shows as redirecting to http://www.bar.org/AwesomePage.html. The same thing happens if I try to set individual files to load from the local system instead of redirecting - IIS gives no error, but the change does not take effect and the files still show as being redirected. How can I set specialdir to override the redirection to the new URL?

    Read the article

  • Clean URLS on Hiawatha

    - by Botto
    I am using the Hiawatha web server and running Drupal on a FastCGI PHP server. The Drupal site uses ImageCache, which requires either private files or clean URLs. The issue I am having with clean URLs is that requests for files are being rewritten to index.php as well. My current config is:

        UrlToolkit {
            ToolkitID = drupal
            RequestURI exists Return
            Match (/files/*) Rewrite $1
            Match ^/(.*) Rewrite /index.php?q=$1
        }

    The above does not work. Drupal's Apache setup is:

        <Directory /var/www/example.com>
            RewriteEngine on
            RewriteBase /
            RewriteCond %{REQUEST_FILENAME} !-f
            RewriteCond %{REQUEST_FILENAME} !-d
            RewriteRule ^(.*)$ index.php?q=$1 [L,QSA]
        </Directory>

    Read the article

  • Booting Linux from External HDD, with persistence

    - by Moriarty
    I am trying to install Linux, specifically Lubuntu or BackTrack 5, on an external HDD (Seagate FreeAgent GoFlex), but I have had no luck using YUMI or UNetbootin to get it working. I want the hard drive to be able to save data within Linux (as in, if I install a program, it will stay there). I also tried doing this with a flash drive, which does boot, but does not save data (I tried following Pendrive's tutorial on creating a casper-rw file and adding "persistent" to various files, but I cannot get it to save files). Basically, I just want a form of Linux on a portable device that will save files and settings between boots. Note: I do not have a CD to install from. Any help would be greatly appreciated, thanks!
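
    For the Ubuntu-family live images (Lubuntu, and BackTrack 5 uses the same casper machinery), persistence comes down to two pieces: a filesystem image named casper-rw in the root of the boot drive, and the "persistent" keyword on the kernel boot line. A minimal sketch of creating the file by hand from a running Linux system (size and mount point are examples only):

        cd /media/USBDISK                                # wherever the boot partition is mounted
        dd if=/dev/zero of=casper-rw bs=1M count=4096    # reserve a 4 GB persistence file
        mkfs.ext3 -F casper-rw                           # format it; -F skips the "not a block device" prompt

    If the file exists but nothing is saved, the usual suspect is the boot entry: the line that loads the kernel has to contain "persistent", and menus generated by YUMI/UNetbootin sometimes omit it.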

    Read the article

  • Permission issue for apache

    - by Aamir Adnan
    Environment details: Amazon EC2, Ubuntu 12.04, Django + mod_wsgi + Python 2.6, web server: apache2.

    I have mounted a 10 GB EBS volume to an instance at /mnt/ebs1/. After mounting the volume and formatting it, I placed all my project files in /mnt/ebs1/project. The wsgi file is /mnt/ebs1/project/apache/django.wsgi, and its content is:

        import os, sys

        sys.path.insert(0, '/mnt/ebs1/project')
        sys.path.insert(1, '/mnt/ebs1')
        os.environ['DJANGO_SETTINGS_MODULE'] = 'project.configs.common.settings'

        import django.core.handlers.wsgi
        application = django.core.handlers.wsgi.WSGIHandler()

    My httpd.conf file looks like this:

        LoadModule wsgi_module /usr/lib/apache2/modules/mod_wsgi.so
        WSGIPythonHome /usr/bin/python2.6
        WSGIScriptAlias / /mnt/ebs1/project/apache/django.wsgi

        <Directory /mnt/ebs1/project>
            Order allow,deny
            Allow from all
        </Directory>

        <Directory /mnt/ebs1/project/apache>
            Order allow,deny
            Allow from all
        </Directory>

        Alias /static/ /mnt/ebs1/project/static/
        <Directory /mnt/ebs1/project/static>
            Order deny,allow
            Allow from all
        </Directory>

    The above configuration gives me "Forbidden: You don't have permission to access / on this server." I tried to find the user which is running Apache using ps aux; it is www-data, with group www-data. I have tried to change the ownership of /mnt/ebs1 and its subdirectories using chown -R www-data:www-data /mnt/ebs1, but that still does not solve the problem. Can anyone tell me what I am doing wrong or have missed?
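
    One thing to rule out, since the ownership already looks right (this is an assumption, not a confirmed diagnosis): every directory on the path down to the project, including /mnt and the /mnt/ebs1 mount point itself, must carry the execute (search) bit for the www-data user, otherwise Apache answers 403 no matter what the files themselves allow. A quick check and a minimal fix:

        # show owner and permissions of every component of the path
        namei -l /mnt/ebs1/project/apache/django.wsgi

        # grant search access on the mount point if it is missing
        sudo chmod o+x /mnt /mnt/ebs1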

    Read the article

  • Windows 7 - sharing file - slow seek time?

    - by progtick
    I do not want to copy the files. I simply have video files (.flv) on one computer, and I would like another computer's user to watch them without copying. Playback is fine, but seek time (as in, if you move the cursor to skip some portion of the video) takes forever! I thought wireless speed might be the culprit, so I wired the two computers. Maybe I saw some improvement, but it is still very bad, and it's a 1 Gbps link! (I know the real speed will vary, but before I start monitoring actual throughput and such, is this a reasonable complaint, or am I bound to have very slow seek times?) What is going on? I must mention that some of these files are huge!

    Read the article

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes) into a JungleDisk cloud. If a few files change there, no big deal, the file gets re-upped. For larger files, JungleDisk backup isn't helpful: things like movies and VMware images change a lot, but I want them backed up. Just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync the change, or delta, to my personal server on a schedule. That way I can keep a copy of my larger items without paying JungleDisk a ton more, since they are in the range of many gigabytes. Right now these few items are backed up over FTP and take forever. Both the client and server are Windows environments.
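
    The usual open-source answer to "send only the delta on a schedule" is rsync, available on Windows through ports such as cwRsync or DeltaCopy; its wire protocol transfers only the changed parts of a file that already exists on the other end. A minimal sketch of the client-side call (host, user and paths are placeholders), which could then be run from Task Scheduler:

        # push only the changed blocks of the large files to the personal server
        rsync -avz --partial --inplace /data/vmware/ backupuser@myserver.example.com:/backups/vmware/

    --partial keeps interrupted transfers so they can resume, and --inplace updates the existing destination file rather than rewriting it from scratch, which matters for multi-gigabyte VM images.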

    Read the article
