Search Results

Search found 50510 results on 2021 pages for 'static files'.

  • Backup Client/Server Software that Syncs only the Delta?

    - by Urda
    I have a co-located server and a desktop computer. I push small things and large amounts of small files (like my iTunes library) into a JungleDisk cloud. If a few files change there, no big deal; the file gets re-uploaded. For larger files JungleDisk backup isn't helpful: things like movies and VMware images that change a lot but that I still want backed up, just not to JungleDisk, since that would cost me even more money. I am looking for a product, closed or open source (preferably open source), that will sync only the change, or delta, to my personal server on a schedule. That way I can keep a copy of my larger items without paying JungleDisk a lot more, since they run to many gigabytes. Right now these few items are backed up over FTP and take forever. Both the client and the server are Windows environments.
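
    One widely used option for exactly this is rsync, which sends only the changed blocks of big files; on Windows it is available through ports such as cwRsync or DeltaCopy (both wrap the same rsync engine). A minimal sketch, assuming rsync plus SSH access to the server; the host, paths and schedule are placeholders:

        rem one delta-transfer run (only changed blocks of the VM images go over the wire)
        C:\cwRsync\bin\rsync.exe -avz --partial /cygdrive/c/VMs/ backup@myserver.example.com:/backups/vms/

        rem schedule it nightly at 02:00 with the built-in task scheduler
        schtasks /create /tn "VM delta backup" /sc daily /st 02:00 /tr "C:\cwRsync\bin\rsync.exe -avz --partial /cygdrive/c/VMs/ backup@myserver.example.com:/backups/vms/"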

    Read the article

  • wsgi - narrow user permissions.

    - by Tomasz Wysocki
    I have the following Apache configuration and my application is working fine: <VirtualHost *:80> ServerName ig-test.example.com WSGIScriptAlias / /home/ig-test/src/repository/django.wsgi WSGIDaemonProcess ig-test user=ig-test </VirtualHost> But I want to protect my files from other users, so I do: chown ig-test /home/ig-test/ -R chmod og-rwx /home/ig-test/ -R and the application stops working: (13)Permission denied: /home/ig-test/.htaccess pcfg_openfile: unable to check htaccess file, ensure it is readable Is it possible to achieve what I'm doing with WSGI? If I have to give read permissions to some files, that is fine, but there are files I have to protect (like the file with the DB configuration or the business logic of the application).
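
    Since WSGIDaemonProcess runs the application as ig-test, the daemon can already read everything that user owns; what the chmod broke is the Apache worker (typically www-data on Debian) being able to traverse /home/ig-test for its .htaccess checks. A sketch that keeps other users out while granting the web server group just enough access; the group name and the settings file name are assumptions:

        chown -R ig-test:www-data /home/ig-test
        chmod -R o-rwx /home/ig-test               # lock out other users entirely
        chmod g+x /home/ig-test                    # Apache may traverse, but not list or read
        chmod -R g+rX /home/ig-test/src            # Apache can read the code it has to map
        chmod 600 /home/ig-test/src/db_settings.py # hypothetical secrets file: daemon user (ig-test) only

    Setting AllowOverride None for that tree in the vhost also stops Apache from looking for .htaccess files there in the first place.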

    Read the article

  • re: 3ware raid 10 (4drive) suggested stripe size suggestions?

    - by dasko
    I looked around on the site but found nothing really concrete on my question. I will have about 120 GB of data total, made up of 5 MB files, Excel and Word documents, and about 25 .pst files of about 1.2 GB each. Yes, the .pst files are used over the network; even though that is not recommended, this is a legacy setup that has run without issue, so we will continue to support it for another year or so. I need to know what you think about a stripe size of 256 KB for the RAID 10 given the above requirements. I did try to benchmark with these settings and it seems all right, without any real issue; I'm just trying to rule out anything I might have missed. Thanks.

    Read the article

  • How to fix Windows 2008 R2 BOOTMGR is missing

    - by cyberkiwi
    BOOTMGR IS MISSING PRESS CTRL+ALT+DEL TO RESTART Note: this is a VM on a VMware ESX server, but that should not matter. I put in the 2008 R2 x64 install DVD and can get to recovery, but it lists no operating systems. Clicking Next brings me to +=========================== System Recovery Options +=========================== Choose a recovery tool Operating system: Unknown or (Unknown) Local Disk ..... Command Prompt I start the command prompt, go to C:\ and perform a dir /a. Apart from files I put there myself, these are showing: $Recycle.Bin Documents and Settings [C:\Users] Program Files Program Files (x86) ProgramData Recovery System Volume Information Temp Users Windows Where to go next? Is it like the NTLDR problem with Windows 2003, where I can just drop a file in there and it will be hunky dory again?
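
    There is no single file to drop in as with NTLDR; on 2008 R2 the boot loader and BCD store are rebuilt from the recovery Command Prompt instead. A sketch of the usual sequence, assuming Windows lives on C: (if the disk has a separate System Reserved partition, that partition may also need to be marked active in diskpart):

        bootrec /fixmbr
        bootrec /fixboot
        bootrec /rebuildbcd
        bcdboot C:\Windows /s C: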

    Read the article

  • Nginx configuration question

    - by Pockata
    Hey guys, I'm trying to make the autoindex feature run only for my IP address with this code: server{ ... autoindex off; ... if ($remote_addr ~ ..*.*) { autoindex on; } ... } But it doesn't work; it gives me a 403. Can someone help me? By the way, I'm using Debian Lenny and Nginx 0.6. EDIT: Here's my full configuration: server { listen 80; server_name site.com; server_name_in_redirect off; client_max_body_size 4M; server_tokens off; # log_subrequest on; autoindex off; # expires max; error_page 500 502 503 504 /var/www/nginx-default/50x.html; # error_page 404 /404.html; set $myhome /bla/bla; set $myroot $myhome/public; set $mysubd $myhome/subdomains; log_format new_log '$remote_addr - $remote_user [$time_local] $request ' '"$status" "$http_referer" ' '"$http_user_agent" "$http_x_forwarded_for"'; access_log /bla/bla/logs/access.log new_log; error_log /bla/bla/logs/error.log; if ($remote_addr ~ 94.156.58.138) { autoindex on; } # Subdomains if ($host ~* (.*)\.site\.org$) { set $myroot $mysubd/$1; } # Static files # location ~* \.(jpg|jpeg|gif|css|png|js|ico)$ { # access_log off; # expires 30d; # } location / { root $myroot; index index.php index.html index.htm; } # PHP location ~ \.php$ { fastcgi_pass 127.0.0.1:9000; fastcgi_index index.php; fastcgi_param SCRIPT_FILENAME $myroot$fastcgi_script_name; include fastcgi_params; } # .Htaccess location ~ /\.ht { deny all; } } I forgot to mention that when I add the code to exclude static files from my access log, the static files can no longer be accessed. I don't know if that's relevant.
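
    A couple of notes: in a regex an unescaped dot matches any character, so a pattern like ..*.* matches every visitor, and whether autoindex is honoured inside an if block at all varies between nginx versions, which may be where the 403 comes from. A sketch that avoids if entirely, using a dedicated listing location restricted to your address (the /listing/ path is an assumption; the root and IP are taken from the posted config):

        # directory listing only for the admin IP; everyone else gets 403 for this path
        location /listing/ {
            alias /bla/bla/public/;
            autoindex on;
            allow 94.156.58.138;
            deny all;
        }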

    Read the article

  • SBSMonitoring.mdf reached limit

    - by Bastien974
    I have SBS 2008 Standard. I am getting some errors in my Event Viewer from MSSQL$SBSMONITORING, event IDs 1105 and 1827: Could not allocate space for object 'dbo.EventLog'.'PK_EventLog' in database 'SBSMonitoring' because the 'PRIMARY' filegroup is full. Create disk space by deleting unneeded files, dropping objects in the filegroup, adding additional files to the filegroup, or setting autogrowth on for existing files in the filegroup. CREATE DATABASE or ALTER DATABASE failed because the resulting cumulative database size would exceed your licensed limit of 4096 MB per database. I tried to shrink the database; that worked for SBSMonitoring_log.LDF, but did nothing for SBSMonitoring.mdf, which is still 4096 MB. I don't know how to reinstall the monitoring. Thanks for your help.

    Read the article

  • MacOS X 10.6 Portable Home Directory sync fails due to FileSync agent crashing

    - by tegbains
    On one of our cleanly installed Mac Pro machines running Mac OS X 10.6.6, connected to our Mac OS X 10.6.6 Server, syncing data using Portable Home Directories fails. It seems to be due to the FileSync agent crashing during the home sync. We get -41 and -8062 errors, which we suspect indicate that there is too much data or that the FileSync agent can't read the files. The user is the owner of the files and can read/write all of them. < Logout 0:: [11/02/04 13:10:42.751] Error -41 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2. (source = NO) < Logout 0:: [11/02/04 13:10:42.758] Error -8062 copying /Volumes/RCAUsers/earlpeng/Library/Mail/Mailboxes/email from old imac./Attachments/12081/2.2/[email protected]. (source = NO) < Logout 1:: [11/02/04 13:10:42.758] -[DeepCopyContext deepCopyError:sourceError:sourceRef:]: error = -8062, wasSource = NO: return shouldContinue = NO

    Read the article

  • Automatic Excel Script

    - by Thomas
    I am a 6th-year medical student and I'm working on my thesis. I have no experience with programming whatsoever; a friend recommended that I post my question here. I am struggling with the following problem: I have data on 400 patients, stored in 400 different Excel files. Each file contains 34 columns in a specific order, let's say A to Z. The order is the same in each of the 400 files. Now I need to make a new Excel document that contains the first column of each patient, so I need all the first columns of my 400 different Excel files lined up next to each other in a new document, preferably in the form of an automatic script. After that I want to do the exact same thing for the second column, then the third, and so on. This is probably a problem that has already been solved; otherwise, could someone help me out? You have my thanks!
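
    For what it's worth, this can be scripted outside Excel as well. Below is a minimal sketch in Python using the openpyxl library; the folder name, file pattern and output name are assumptions, and it expects .xlsx files (old .xls files would first need converting). It copies column A from every workbook into one new sheet, one patient per column; change COLUMN to 2, 3, ... and rerun for the other columns:

        # pip install openpyxl
        import glob
        from openpyxl import Workbook, load_workbook

        COLUMN = 1                                # 1 = column A
        out_wb = Workbook()
        out_ws = out_wb.active

        for patient_no, path in enumerate(sorted(glob.glob("patients/*.xlsx")), start=1):
            ws = load_workbook(path, read_only=True).active
            for row_no, values in enumerate(
                    ws.iter_rows(min_col=COLUMN, max_col=COLUMN, values_only=True), start=1):
                out_ws.cell(row=row_no, column=patient_no, value=values[0])

        out_wb.save("column_%d_all_patients.xlsx" % COLUMN)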

    Read the article

  • Data Archiving vs not

    - by Recursion
    For the sake of data integrity, is it wiser to archive your files or just leave them unarchived? No compression is being used. My thinking is that if you leave your files unarchived, any corruption will only hurt a smaller number of files. But if you archive, let's say, all of your documents, then even the slightest corruption can make the entire archive unrecoverable. So what's the best way to keep a clean file system without being subject to data corruption?

    Read the article

  • Create Virtual Image of Laptop before Formatting

    - by Simon Mark Smith
    I have a 3 year old laptop running Windows XP that I used for business. Although I have not used the laptop in over a year, I now want to re-commission it with Windows 7 and a fresh install. Before I do the fresh install I want to create a Virtual Image of the laptop that I can keep and potentially run on my desktop machine should I ever need to access any of the old files/projects that it contains currently. I know that most people will say just copy the files over to your desktop, but my concern is the configuration of the laptop. I used to use it for development and it has older versions of Visual Studio, SQL Server, Active X controls etc, etc than I currently use so I really want to preserve the environment not just the files. So really I am asking what is the best tool-set/method to achieve this? I understand there are free VM tools available but I have never done this before and would appreciate any help.

    Read the article

  • Nginx Restart Issues

    - by heavymark
    All of a sudden, when restarting Nginx I get the following error: Restarting nginx: [alert]: could not open error log file: open() "/var/log/nginx/error.log" failed (13: Permission denied) 2011/02/16 17:20:58 [warn] 23925#0: the "user" directive makes sense only if the master process runs with super-user privileges, ignored in /etc/nginx/nginx.conf:1 the configuration file /etc/nginx/nginx.conf syntax is ok 2011/02/16 17:20:58 [emerg] 23925#0: open() "/var/run/nginx.pid" failed (13: Permission denied) configuration file /etc/nginx/nginx.conf test failed On the front end, part of the site loads, but some files, the CSS in particular, are not loading. They exist on the server, but when loading the resources directly in Chrome they say "Oops, this page can't be found." I set up a special group and user to run my Apache sites using suexec for my domain's files. I think the nginx files are owned by root, however, which I'm assuming is the problem, but which nginx file ownerships would I change?
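
    Those two "Permission denied" lines plus the warning about the user directive suggest the restart is being run as a non-root user: the nginx master process has to start as root to open port 80, the pid file and the logs, and only the workers drop to the unprivileged user. A sketch of what to check, assuming a Debian-style init script:

        # restart as root (the master needs it; workers still run as the user from nginx.conf)
        sudo /etc/init.d/nginx restart

        # if ownership was changed at some point, hand nginx's runtime files back to root
        sudo chown root:root /var/log/nginx /var/run/nginx.pid
        sudo nginx -t

    The missing CSS is probably a separate issue: the worker user named in nginx.conf needs read access to those static files, and the suexec user/group created for Apache may not cover them.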

    Read the article

  • Can't Create New folder in my video folder

    - by tiki
    My OS is Windows 7 Ultimate 32-bit. My video folder location is C:\Users\User\Documents\Downloads\Video. I can't create a folder in the Video folder or see the files I saved there. But I can see the files in the root location of the folder, i.e. by going to the folder from My Computer. I can create and find files when I go to the folder via My Computer > C > Users > User > Documents > Downloads > Video. This is an unusual problem I have never seen. Even as a system administrator I haven't been able to fix it. Please help me out. Thanks in advance.

    Read the article

  • Apache - building extensions with apxs

    - by Brian
    Hello, pardon the newbie question; I haven't worked with manually compiling Apache modules (or anything) before. I am trying to get the mod_concat module going. It seems simple enough: just download the mod_concat.c file and then run: apxs -c mod_concat.c This is new to me. Does it matter which directory I put mod_concat.c in before running this command? I ran it from my home directory, and I see some new files (mod_concat.la, mod_concat.lo, mod_concat.o, and mod_concat.slo) along with a new subfolder called .libs/ that contains mod_concat.so and some other files. I'm not sure where to go from here; I have a feeling these files were created in the wrong place. Don't I need mod_concat.so to be in my Apache modules directory with the rest? Thanks for the help, Brian
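
    The compile directory doesn't matter much; the missing piece is the install step. apxs can compile, copy the module into Apache's configured modules directory, and add the LoadModule line in one pass. A sketch, run as root from wherever mod_concat.c lives:

        # -c compile, -i install into the configured modules directory, -a add/enable the LoadModule line
        apxs -c -i -a mod_concat.c

        # sanity-check the config and reload Apache
        apachectl configtest && apachectl graceful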

    Read the article

  • Apache stopping downloads part way through

    - by Ben Smiley
    On my site there are some digital files which can be downloaded through a PHP script. The script works fine for small files, but large files (e.g. 115 MB) cannot be downloaded successfully. The connection dies after around 15 minutes, but it's not consistent; sometimes it's longer, sometimes shorter. I don't think it's a problem with the script timing out, because the download time isn't consistent. Equally, it doesn't seem like a memory-limit problem, because the amount downloaded varies each time. Does anyone know of any Apache- or PHP-related settings which could cause this kind of problem?
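
    If the PHP script streams the file itself, the usual suspects are PHP's execution-time limit and Apache's Timeout rather than memory, provided the file is sent in chunks instead of being read whole. A hedged sketch of the settings worth checking; the values are examples:

        ; php.ini, or ini_set() at the top of the download script
        max_execution_time = 0        ; no hard time limit for the streaming script
        output_buffering = Off

        # httpd.conf
        Timeout 3600                  # how long Apache waits on a stalled connection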

    Read the article

  • IIS's SMTP Pickup timing

    - by fatcat1111
    I have IIS's SMTP server set up as a closed relay, and it's working nicely. I also have an application that writes EML files. If the EML files are written to a temporary directory, then moved to the server's Pickup directory, email is sent as expected. However, if I have the application write the EML files directly to the Pickup directory, the email will often fail to send. This seems to be a race condition: the server starts processing the EML file as soon as it detects it in Pickup, even though the application hasn't completed writing it. The result is the server considers the EML to be malformed, and it punts it to Badmail. While I very much appreciate the server's earnestness, it seems that I need to dial it back a bit for this scenario. Does anybody know if IIS's SMTP server's polling frequency can be configured? I am using IIS7, Windows Server 2008 R2. The application that writes the EML cannot be modified.

    Read the article

  • Tell browsers to cache until last modified date changes?

    - by Chad Johnson
    My web site consists of static HTML files which are usually republished once per day, and sometimes more often. I'm using Apache. In the vhost settings for my site, I'd like to tell browsers to cache HTML files indefinitely, until Apache sees that they are modified. So as soon as an HTML file is changed, Apache should immediately begin telling browsers it has changed and send the updated file. As soon as a new file is published, browsers should immediately begin receiving it; they should never receive old versions of files. Maybe something like ExpiresByType text/html "modification" with no "plus x days"? Is something like this possible?
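
    Browsers can't be told to cache "forever until it changes" without ever asking again, but you can make them revalidate on every use: an unchanged file then costs only a 304 Not Modified response (Apache answers it from the file's ETag/Last-Modified), and a republished file is fetched immediately. A sketch for the vhost, assuming mod_headers is enabled:

        <FilesMatch "\.html$">
            # always revalidate; Apache serves 304 when the file is unchanged
            Header set Cache-Control "no-cache"
        </FilesMatch>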

    Read the article

  • Google App Engine says "Must authenticate first." while trying to deploy any app

    - by Oleksandr Bolotov
    Google App Engine says "Must authenticate first." while trying to deploy any app: me@myhost /opt/google_appengine $ python appcfg.py update ~/sda2/workspace/lyapapam/ Application: lyapapam; version: 1. Server: appengine.google.com. Scanning files on local disk. Scanned 500 files. Scanned 1000 files. Initiating update. Email: <my_email_was_here>@gmail.com Password for <my_email_was_here>@gmail.com: Error 401: --- begin server output --- Must authenticate first. --- end server output --- We are getting this message with any application and under any developer account available to us. Here is what we have installed: App Engine SDK - 1.3.2 PIL - 1.1.7 Python - 2.5.5 pip - 0.6.3 ssl - 1.15 wsgiref - 0.1.2 So, what can it be? Is it a well-known problem?
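
    One cheap thing to try first: appcfg.py caches its authentication cookies in the home directory, and a stale cache can produce exactly this 401. A sketch, assuming the SDK's default cookie file name (drop the --no_cookies flag if your SDK version doesn't support it); if the Google account uses 2-step verification, an application-specific password is needed instead of the normal one:

        rm ~/.appcfg_cookies
        python appcfg.py --no_cookies update ~/sda2/workspace/lyapapam/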

    Read the article

  • The specified module (mod_h264_streaming) could not be found (Apache2)?

    - by rphello101
    I'm trying to get the mod_h264_streaming to work with my Apache2 server. I downloaded a precompiled version of the mod from here. I read here that all I have to do is extract the file to my modules folder, which I did, and add LoadModule h264_streaming_module modules/mod_h264_streaming.so AddHandler h264-streaming.extensions .mp4 to the httpd.conf, which I also did. However, I get this error when I restart Apache: Syntax error on line 173 of C:/Program Files (x86)/Apache Group/Apache2/conf/httpd.conf: Cannot load C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so into server: The specified module could not be found. Note the errors or messages above, and press the <ESC> key to exit. 26... Even though the file exists right here: C:\Program Files (x86)\Apache Group\Apache2\modules\mod_h264_streaming.so Can anyone tell me what I'm doing wrong?
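
    Quoting the full path in LoadModule rules out path-resolution problems, but on Windows this particular error usually means something the module itself depends on can't be loaded: a dependent DLL that isn't on the PATH, or a build made for a different Apache version or VC runtime than the one installed (a tool such as Dependency Walker run against the .so will show missing DLLs). A sketch of the quoted form:

        LoadModule h264_streaming_module "C:/Program Files (x86)/Apache Group/Apache2/modules/mod_h264_streaming.so"
        AddHandler h264-streaming.extensions .mp4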

    Read the article

  • Copying windows 8 Users folder having long long paths

    - by bilal.haider
    I was trying to move my "Users" folder in Windows 8 as described here and here. But when I try to copy the folder using "xcopy" in the Windows installation disk's repair mode, after some files are copied I get "insufficient memory". The files on which the error is reported look like C:\Users\Bilal\Application Data\Application Data\Application Data.........Application Data\Application Data..... What is the point of such directories within directories? I also tried copying them using Mini Windows XP, but the problem was there too. I also tried copying with a Parted Magic live CD, but still no luck. So now, how can I move them? Another question: is moving such system files using Linux a good idea? Does it do anything to permissions?
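
    Those endlessly nested Application Data folders are not real directories: they are the NTFS junction points Windows keeps for backward compatibility, and each one points back into the profile, so xcopy chases them in a loop until the path length explodes. robocopy can skip junctions; a sketch, assuming the profiles are being copied from C:\Users to D:\Users (if robocopy isn't on the PATH in the repair environment, it can usually be run from the offline copy at C:\Windows\System32\robocopy.exe):

        robocopy C:\Users D:\Users /E /COPYALL /XJ /R:1 /W:1

    /XJ is the important switch (exclude junction points); /COPYALL keeps owners and ACLs, which a copy made from Linux would not preserve, so for profile folders the Windows-side copy is the safer route.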

    Read the article

  • Linux file permissions not being preserved

    - by yellavon
    I am deploying some custom software as root (a necessity in this situation). I set the owner/group to user1:user1 and set all the files to 644 beforehand in the shell, then copy and deploy with Ant. However, when the files get copied over from the deployment directory, the ownership changes back to root and all the files are installed with 666 permissions. This seems to occur whether a file is overwritten or newly created. I believe there is a way to set an option on the cp and mv commands to preserve permissions, but that would be a lot of commands to change. How can I fix this? Is there some setting I can change temporarily for root so the install always preserves the file permissions?
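
    Ant's <copy> task does not preserve Unix ownership or permissions, so whatever you set beforehand is lost in the copy. Two hedged options: restore ownership and modes in a post-copy step, or bypass Ant for the copy and use cp with its preservation flags. A sketch, with /opt/myapp standing in for your deployment target:

        # after the Ant copy, as root (paths are examples)
        chown -R user1:user1 /opt/myapp
        find /opt/myapp -type f -exec chmod 644 {} +
        find /opt/myapp -type d -exec chmod 755 {} +

        # or copy manually, preserving mode, ownership and timestamps in one go
        cp -a staging/. /opt/myapp/

    Ant also has a <chmod> task that can be chained after the copy inside the build file, which keeps the fix-up in the same place as the deployment.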

    Read the article

  • Problem routing between directly connected Subnets w/ ASA-5510

    - by Zephyr Pellerin
    This is an issue I've been struggling with for quite some time, with a seemingly simple answer (aren't all IT problems like that?): the problem of passing traffic between two directly connected subnets with an ASA. While I'm aware that best practice is to have Internet - Firewall - Router, in many cases this isn't possible. For example, I have an ASA with two interfaces, named OutsideNetwork (10.19.200.3/24) and InternalNetwork (10.19.4.254/24). You'd expect Outside to be able to get to, say, 10.19.4.1, or at LEAST 10.19.4.254, but pinging the interface gives only bad news. Result of the command: "ping OutsideNetwork 10.19.4.254" Type escape sequence to abort. Sending 5, 100-byte ICMP Echos to 10.19.4.254, timeout is 2 seconds: ????? Success rate is 0 percent (0/5) Naturally, you'd assume that you could add a static route, to no avail. [ERROR] route Outsidenetwork 10.19.4.0 255.255.255.0 10.19.4.254 1 Cannot add route, connected route exists At this point, you might wonder if it's a NAT or access-list problem. access-list Outsidenetwork_access_in extended permit ip any any access-list Internalnetwork_access_in extended permit ip any any There is no dynamic NAT (or static NAT for that matter), and un-NATted traffic is permitted. When I try pinging the above address (10.19.4.254 from Outsidenetwork), I get this error message from level 0 logging (debugging): Routing failed to locate next hop for icmp from NP Identity Ifc:10.19.200.3/0 to Outsidenetwork:10.19.4.1/0 This led me to set same-security traffic permit, and to assign the same, lesser, and greater security numbers between the two interfaces. Am I overlooking something obvious? Is there a command to set static routes that are classified higher than connected routes?
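
    One detail that explains the ping results: an ASA never answers pings sent to the address of an interface other than the one the packet arrives on, so pinging 10.19.4.254 from the OutsideNetwork side is expected to fail even when routing is healthy. A better test is host-to-host through the box, with ICMP inspection enabled so the echo replies are allowed back; a sketch (the same-security line only matters if both interfaces share a security level):

        ! let return ICMP traffic back through the firewall
        policy-map global_policy
         class inspection_default
          inspect icmp
        !
        same-security-traffic permit inter-interface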

    Read the article

  • Folder Redirection won't load on Windows 7 Machine in Windows 2008 R2 Network

    - by leeand00
    Okay, so redirected profiles don't load, exactly; the computer is joined to the network, but it won't display any of the user's files on her desktop that are in her redirected profile. I know the files are there because we have a Terminal Server, and when the user logs in there, her files appear. I checked the user's profile in Active Directory Users and Computers and compared it with a working user's profile. When that didn't turn up any differences, I looked at her computer and found that on the Dial-in tab the Network Access Permission wasn't set to "Control access through NPS Network Policy" like it was on the other machines on the network; so I selected it, ran gpupdate /force on her machine and rebooted. This did not fix the issue. Is there anything else that could be preventing the redirected files on the user's desktop from showing up when she logs in?
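
    Beyond the Dial-in tab, it is worth confirming on the client which policies and redirection settings actually applied to that user; Folder Redirection also writes its own operational log under Applications and Services Logs > Microsoft > Windows > Folder Redirection in Event Viewer. A quick sketch, run as the affected user on her machine:

        REM refresh policy, then list what applied to the user
        gpupdate /force
        gpresult /r /scope user
        REM full HTML report, useful for spotting a redirection GPO that was filtered out
        gpresult /h C:\gpreport.html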

    Read the article

  • Mass remove passwords from rar archives

    - by ldigas
    Is there a way to mass-remove passwords from a bunch of RAR files? (I'm using the WinRAR demo, but I'm willing to change it to whatever is needed.) Problem description: for reasons unknown to me, some archiving was done for two-and-something years in RAR format, and all the archives have passwords. I have a list of them, and they are all similar (mostly something like John-03, John-04, John-05, i.e. name-month), but I need to manipulate the files in bulk, and removing passwords or extracting all those files while entering passwords manually is a real problem. What would be my best options here? Ideally, I'm looking for some kind of archiver which tries out a predefined list of passwords and asks only if none of them cracks the safe. AFAIK, WinRAR has no such feature.
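
    There is no one-shot "strip the password" switch, but WinRAR ships a command-line tool (Rar.exe / UnRAR.exe) that takes the password as a -p switch, so a small loop can try each password from your list against each archive and extract with whichever one works; the extracted trees can then be re-archived without a password if needed. A sketch as a batch file, assuming UnRAR.exe is on the PATH and passwords.txt holds one password per line:

        @echo off
        mkdir extracted 2>nul
        for %%A in (*.rar) do (
            for /f "usebackq delims=" %%P in ("passwords.txt") do (
                rem a wrong password simply fails; the right one extracts into extracted\
                UnRAR.exe x -y -p%%P -inul "%%A" extracted\
            )
        )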

    Read the article

  • File not updating in symlink'd folder in IIS

    - by Daniel Short
    I have the following setup: Site1/Shared/ - physical folder Site2/Shared/ - symlink to Site1/Shared, created using mklink I've updated a JavaScript file in Site1/Shared/scripts, and the change is being reflected on Site1. However, the change is not being reflected through IIS on Site2. When I open Site1/Shared/scripts/common.js and Site2/Shared/scripts/common.js, they match exactly. But when I view the files through Safari, Firefox, Chrome, or IE, from any machine (even machines that have never visited the sites), the change is not reflected on Site2. Here are URLs to the files to review: Site 1: http://www.landsofamerica.com/shared/scripts/common.js Site 2: http://www.landsoftexas.com/shared/scripts/common.js These files look exactly the same when logged onto the server, and the shared folder under landsoftexas.com is a symlink created using mklink to the shared folder under landsofamerica.com. Any idea what might be causing IIS to serve the wrong file?
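
    One plausible culprit is IIS's static-file caching: IIS caches based on file-change notifications, and those notifications don't always propagate across an mklink symlink, so Site2 keeps serving its cached copy while Site1 (the physical folder) sees the change. A hedged sketch that turns caching off for the shared folder so every request re-reads the file from disk; drop this web.config into Site2/Shared/:

        <configuration>
          <system.webServer>
            <!-- disable output and kernel caching for this folder (assumption: the staleness comes from the cache) -->
            <caching enabled="false" enableKernelCache="false" />
          </system.webServer>
        </configuration>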

    Read the article
