Search Results

Search found 101795 results on 4072 pages for 'encrypting file system'.

  • server shrinks html file to 0 bytes on upload

    - by user38714
    Hi there, I have come across a very unusual problem. I am working on a website at the moment, and whenever I upload a file to the server it becomes a 0 KB file. I switched FTP clients to FileZilla to see whether that would help, but it hasn't. I compared the site to the other sites I work on: the permission numbers are the same, 0644, but on the site I am having trouble with, the permissions carry the prefix flcdmpe (0644). Could that be the problem, and if so, any ideas how to change it? Changing the permissions doesn't work. I have been onto the hosting company and they cannot figure it out at all. Any help would be appreciated. Cheers, Emma

  • Cisco VPN and .pcf file

    - by yael
    I have a few profile .pcf files, and I use them to automate the vpnclient connection via CLI commands on a Windows XP machine, for example: vpnclient connect "customor_alpha". Until now everything has been OK, but I have a problem with the last of my profiles, area1.pcf. When I type the following in a CMD window (to create the VPN connection): vpnclient connect "area1", after 2 seconds a Cisco window pops up and asks for the password (the username is already defined in the window). Please advise what could be the problem: why do I get the Cisco VPN window? Or maybe I have some incorrect syntax in my .pcf file? I have checked the .pcf file again and again and couldn't find the problem. Example of area1.pcf (only an example, not my real pcf):
      [main]
      Description=connection to TechPubs server
      Host=10.10.99.30
      AuthType=1
      GroupName=docusers
      GroupPwd=
      enc_GroupPwd=158E47893BDCD398BF863675204775622C49<SNIPPED>
      EnableISPConnect=0
      ISPConnectType=0
      ISPConnect=
      ISPCommand=
      Username=alice
      SaveUserPassword=0
      UserPassword=
      enc_UserPassword=
      NTDomain=
      EnableBackup=1
      BackupServer=Engineering1, Engineering2, Engineering 3, Engineering4
      EnableMSLogon=0
      MSLogonType=0
      EnableNat=1
      EnableLocalLAN=0
      TunnelingMode=0
      TCPTunnelingPort=10000
      CertStore=0
      CertName=
      CertPath=
      CertSubjectName=
      SendCertChain=0
      VerifyCertDN=CN="ID Cert",OU*"Cisco",ISSUER-CN!="Entrust",ISSURE-OU!*"wonderland"
      DHGroup=2
      PeerTimeOut=90
      ForceNetLogin=

  • APC File Cache not working but user cache is fine

    - by danishgoel
    I have just got a VPS (with cPanel/WHM) to test what gains I could get in my application from using the APC file cache AND user cache. First I got PHP 5.3 compiled as a DSO (Apache module), then installed APC via PECL through SSH. (I first tried the WHM module installer; it had the same problem, so I tried it via SSH.) All seemed fine: phpinfo showed APC loaded and enabled, and apc.php also looked OK. But as I started testing my PHP application, the stats in APC for File Cache Information stated:
      Cached Files                  0 (0.0 Bytes)
      Hits                          1
      Misses                        0
      Request Rate (hits, misses)   0.00 cache requests/second
      Hit Rate                      0.00 cache requests/second
      Miss Rate                     0.00 cache requests/second
      Insert Rate                   0.00 cache requests/second
      Cache full count              0
    That means no PHP files were being cached, even though I had browsed through over 10 PHP files with multiple includes, so there should have been some cached files. The user cache, however, is functioning fine:
      User Cache Information
      Cached Variables              0 (0.0 Bytes)
      Hits                          1000
      Misses                        1000
      Request Rate (hits, misses)   0.84 cache requests/second
      Hit Rate                      0.42 cache requests/second
      Miss Rate                     0.42 cache requests/second
      Insert Rate                   0.84 cache requests/second
      Cache full count              0
    Those user-cache numbers actually come from an APC caching test script that stores and retrieves 1000 entries and reports the times; a sort of simple benchmark. Can anyone help me here? Even though apc.cache_by_default = 1, no PHP files are being cached. This is my APC runtime configuration:
      apc.cache_by_default        1
      apc.canonicalize            1
      apc.coredump_unmap          0
      apc.enable_cli              0
      apc.enabled                 1
      apc.file_md5                0
      apc.file_update_protection  2
      apc.filters
      apc.gc_ttl                  3600
      apc.include_once_override   0
      apc.lazy_classes            0
      apc.lazy_functions          0
      apc.max_file_size           1M
      apc.mmap_file_mask
      apc.num_files_hint          1000
      apc.preload_path
      apc.report_autofilter       0
      apc.rfc1867                 0
      apc.rfc1867_freq            0
      apc.rfc1867_name            APC_UPLOAD_PROGRESS
      apc.rfc1867_prefix          upload_
      apc.rfc1867_ttl             3600
      apc.serializer              default
      apc.shm_segments            1
      apc.shm_size                32M
      apc.slam_defense            1
      apc.stat                    1
      apc.stat_ctime              0
      apc.ttl                     0
      apc.use_request_time        1
      apc.user_entries_hint       4096
      apc.user_ttl                0
      apc.write_lock              1
    Also, most of my PHP files are under 20 KB, so apc.max_file_size = 1M is not the cause. I have tried using apc_compile_file() to force some files into the opcode cache, with no luck. I have re-installed APC with debugging enabled, but nothing shows up in the error_log. I have also tried setting mmap_file_mask to /dev/zero and to /tmp/apc.xxxxxx, and I have set /tmp permissions to 777, to no avail. Any clue, anyone?
    Update: I have tried the following, and none of it causes the APC file cache to populate:
    1. Set apc.enable_cli = 1 and run a script from the CLI.
    2. Set apc.max_file_size = 5M (just in case).
    3. Switched the PHP handler from DSO to FastCGI in WHM (then switched it back to DSO, as it did not solve the problem).
    4. Even tried restarting the container.
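    (Not part of the original post: as a quick, hedged sanity check that the extension answers for both caches, apc_cache_info() can be called from the CLI; note that the CLI gets its own private APC shared memory, so this only verifies the function calls work, it cannot inspect Apache's cache. A minimal sketch, assuming the apc extension is available to the CLI SAPI:)
      # no argument = opcode (file) cache, "user" = user cache
      php -d apc.enable_cli=1 -r 'var_dump(array_keys(apc_cache_info()));'
      php -d apc.enable_cli=1 -r 'var_dump(array_keys(apc_cache_info("user")));'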

  • rsync for copying file

    - by vinayrks
    I am migrating my old server to a new server; I used this server for hosting websites. First I tried SFTP, but with the huge number of files and connection timeouts it simply didn't work. Then I tried rsync. rsync is working well, but the only problem I am facing is that it updates existing files nicely and quickly yet does not copy new files. Please help me, because I still need to transfer lots of files. I am using this command: rsync -anv -e ssh oldserver:/path/ /path
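    (Not part of the original post: for reference, a minimal annotated sketch of that command, with the host and paths taken from the question; note that -n is rsync's dry-run switch, so nothing is actually transferred while it is present.)
      # -a      archive mode: recurse and preserve permissions, times, links, etc.
      # -n      dry run: report what would be transferred, but copy nothing
      # -v      verbose output
      # -e ssh  run the transfer over ssh
      rsync -anv -e ssh oldserver:/path/ /path
      # the same transfer without -n actually copies the files
      rsync -av -e ssh oldserver:/path/ /path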

  • My PC becomes very slow after changing the page file in XP

    - by Jane
    Hello there, I have 5 partitions on my computer (C, D, E, F, G), and C: is the system partition. Recently I changed the page file setting to use 768-768 MB on G:, without changing the C: default value (256-512 MB). Everything ran fine until I rebooted my PC; I had unset G:'s page file before rebooting. Well, after the reboot, XP became VERY slow. Damn slow. I don't know how long I waited for the boot screen to disappear (maybe 1-3 minutes; normally it is under 10 seconds), and once I reached the desktop it was like using a computer without a VGA card or VGA driver installed. Very slow. Any suggestions to fix this problem? I'd hate to have to repair-install my XP. Thanks.

  • post-receive hook permission denied "unable to create file" error

    - by ThomasReggi
    Just got gitolite installed on my web server and I am trying to get a post-receive hook that points the Git work tree at Apache's directory. This is what my post-receive hook looks like (I got the script from the "Using Git to manage a web site" article):
      #!/bin/sh
      echo "post-receive example.com triggered"
      GIT_WORK_TREE=/srv/sites/example.com/public git checkout -f
    This is the error response I'm getting back from git push origin master on my local workstation; these are files from within my repository:
      remote: post-receive example.com triggered
      remote: error: unable to create file .htaccess (Permission denied)
      remote: error: unable to create file .tm_sync.config (Permission denied)
      remote: fatal: cannot create directory at 'application': Permission denied
    Permissions of public:
      drwxr-xr-x 5 root root 4096 Jun 26 17:23 public
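    (Not part of the original post: for context, a minimal commented sketch of the same kind of hook, using the paths from the question. Inside a hook GIT_DIR is normally already set by Git, so only the work tree needs pointing at the web root, and the checkout runs as whichever user received the push, which is why that user needs write access to the target directory.)
      #!/bin/sh
      # runs as the user that received the push (here the gitolite user);
      # that user must be able to write into /srv/sites/example.com/public
      echo "post-receive example.com triggered"
      GIT_WORK_TREE=/srv/sites/example.com/public git checkout -f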

  • Get sessions' remote IP from Teamviewer log file

    - by etuardu
    I'd like to know who has logged in to my machine and when. I have two TeamViewer log files: Connections_incoming.txt and TeamViewer7_Logfile.log. The first one is quite plain and lists, as its name says, the incoming connections to the machine, reporting the local name of the remote host, login time, logout time, and some IDs, e.g.:
      173274362 MYLAPTOP 20-02-2012 17:32:16 20-02-2012 17:50:42 Master RemoteControl {C5AAE483-ED0B-54B8-9235-7AE597CAD342}
    This is almost all I need, but unfortunately no remote IP address is reported here, so I checked for IPs in TeamViewer7_Logfile.log, but it is really messy. It does contain some IP addresses, but I can't tell which one is bound to which item in the first log file. Is there a way to correlate the two logs to get what I need? Should I search the second file for some particular text? What do you suggest?
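    (Not part of the original post: one hedged starting point is that the number at the front of each Connections_incoming.txt line is the remote TeamViewer ID, and that ID usually also appears in the verbose log, so grepping for it narrows the second file down to the lines around that session. A minimal sketch using the ID from the example above:)
      # show every line of the verbose log that mentions that session's TeamViewer ID
      grep -n '173274362' TeamViewer7_Logfile.log
      # include a few lines of context around each match to catch nearby IP entries
      grep -n -C 3 '173274362' TeamViewer7_Logfile.log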

  • Cropping a PDF File's Margin During Printing

    - by JavaMan
    I'm using the free Acrobat Reader to print some PDF documents that have very large top/bottom/left/right margins. I want to remove the margins, which waste too much space and make the fonts too small. I used to use Acrobat (the paid version with editing features) to crop the source PDF manually, but since it is an old version it does not support the new PDF format, and I don't want to upgrade for such a simple use. Is there any free way to crop/remove unwanted white margins from the printed PDF? I am thinking of printing the PDF files to a PDF printer like the Bullzip PDF Printer and enlarging the output file manually so as to remove any white margin, but there does not seem to be such a feature in Bullzip PDF Printer. Is there any other virtual printer software that can be used for this purpose?
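    (Not part of the original post: as an aside, on machines with a TeX distribution installed, the pdfcrop utility can trim white margins from the command line; a minimal sketch with made-up file names:)
      # detect the content's bounding box and trim the surrounding white space
      pdfcrop input.pdf cropped.pdf
      # or keep a small uniform margin (left top right bottom, in PostScript points)
      pdfcrop --margins '10 10 10 10' input.pdf cropped.pdf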

  • Finding Image resolution in PDF file?

    - by Dave
    I have a problem with some users creating very large PDFs. On the other hand, the PDFs sent from our fax machines are really small and perfectly printable. My questions: Is there any way to find the resolution (DPI) of a PDF? I searched the internet and could not find an answer, and the information is not stored in the file's properties, at least in my case. What is the optimum resolution for converting a text document into an image PDF: 96 dpi, 300 dpi or more? Fun question: can I resize a PDF that was scanned at a high DPI down to a smaller DPI? I know some answers might not be available, as I have already searched the internet and could not find them. Note: my PDFs are entirely images (text converted to images). I am also familiar with PrimoPDF (free), something you can experiment with.
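    (Not part of the original post: since the PDFs are entirely images, one hedged way to read the effective DPI of each embedded image is poppler's pdfimages tool; recent versions report per-image x/y ppi with -list. A minimal sketch, file name assumed:)
      # list the embedded images with their pixel size, colour depth and x/y ppi
      pdfimages -list scanned.pdf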

  • Temporary file storage - script for webserver

    - by Chris
    I'm looking for a script (preferably PHP) that I can use to exchange files temporarily. I had such a script before (it was written in Flash) but, damn, I just can't find it anymore. Here are the features I'm looking for: I have a login that allows me to create "upload spaces". For each upload space I define how long it will be available (until the files are deleted) and who can access it, by typing in an email address; that user gets a link to the upload space together with a user ID and password. The other user then clicks on the link and uploads the file, and I (preferably) get an email whenever the files change, plus an email before the content gets deleted. Three additional things: it should be open source, run on Linux (preferably LAMP), and no, I don't want to use Dropbox or similar. Thanks in advance everybody. Cheers, Chris

  • Outlook 2010: Cached Exchange Mode, File Storage and Security

    - by dangowans
    I'm in an environment where profile space is at a premium, and most users have "frozen" machines, meaning that on restart the C: drive is returned to its original state. Cached Exchange Mode sounds interesting to me, but I'm wondering whether we can take advantage of it without causing other issues. Where in the file system does the cached data get stored? Is it in the profile? A temp folder? Is the cached file secured in some way to keep others from seeing it?

  • Add TOC to PDF from XML/JSON/file?

    - by Elias
    I currently have a PDF file without any ToC (for example, in Mac's Preview.app I can't see a ToC in the sidebar), but I do have the ToC in XML format, with a title and the page number at which each section starts. Is there any way to add that ToC to my PDF file in a batch fashion? Since I have the ToC as XML I can parse it in any way needed, so if there were a command-line tool for adding a ToC item to a PDF, I could work with that too. Any ideas?
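    (Not part of the original post: as one illustration of the kind of command line being asked about, Ghostscript can attach outline/bookmark entries to a PDF via its pdfmark operator, so a parsed XML ToC could be written out as a pdfmarks file and applied in one pass. A minimal sketch, with file names and titles made up:)
      # toc.pdfmarks would be generated from the XML, one /OUT pdfmark line per entry
      printf '%s\n' \
        '[/Title (Chapter 1) /Page 1 /OUT pdfmark' \
        '[/Title (Chapter 2) /Page 17 /OUT pdfmark' > toc.pdfmarks
      # rewrite the PDF with the outline entries attached
      gs -dBATCH -dNOPAUSE -sDEVICE=pdfwrite -sOutputFile=with-toc.pdf input.pdf toc.pdfmarks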

  • Moving the OS X swap file to a faster drive

    - by Milky Joe
    I have a new Mac Mini running the latest version of Snow Leopard. The internal drive is a bit of a slouch, so I'd like to move the swap file (or whatever it's called in OS X) to my faster external drive (FireWire 800, permanently connected). Is this possible? I've read that the old solutions don't work in 10.6. My Mac has 2 GB of RAM, so swap gets used quite a bit when I'm doing intensive work (Photoshop, etc.).

  • SQL restore from single file db to filegroup

    - by Mauro
    I have a 180 GB MOSS 2007 database whose maintenance (i.e. backups and restores) is becoming a problem. Part of the issue can be resolved by splitting the three content sites into their own site collections, but that will likely still leave me with a 100 GB database to deal with. While this isn't entirely problematic for SQL Server, it does mean that backups and restores take far too long. My idea is to split each database into 30 GB files and then import the content into them, which should distribute the content across the filegroups, making it much easier and faster to back up and restore. Is there a way to back up from a single file and restore to a filegroup? If I have the wrong understanding of filegroups, then I'm more than happy to learn about other methods of managing the size of databases.

  • Copy a file from source directory to target base directory and maintain source path

    - by Citizen Dos
    Forgive me, I am probably not using the right terms to describe the problem and misunderstanding the most basic usage of a couple of common commands. I have a simple find statement that locates the files I want to copy. I want to tack on -exec cp {} and have cp copy each file from the source directory to a new base directory while keeping the full source path. For example: "find . -name *.txt" locates /user/username/projects/source.txt, and "cp {} [now what?]" should copy the file to /user/newuser/projects/source.txt
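    (Not part of the original post: with GNU cp, the --parents flag recreates each file's source path under the destination directory, which is one way to get the behaviour described. A minimal sketch, assuming the find runs from /user/username and the destination root is /user/newuser:)
      # --parents appends the full source path under the destination directory,
      # so ./projects/source.txt is copied to /user/newuser/projects/source.txt
      find . -name '*.txt' -exec cp --parents {} /user/newuser \;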

  • File not updating in symlink'd folder in IIS

    - by Daniel Short
    I have the following setup: Site1/Shared/ is a physical folder, and Site2/Shared/ is a symlink to Site1/Shared created with mklink. I've updated a JavaScript file in Site1/Shared/scripts, and the change is reflected on Site1. However, the change is not reflected through IIS on Site2. When I open Site1/Shared/scripts/common.js and Site2/Shared/scripts/common.js, they match exactly. But when I view the files through Safari, Firefox, Chrome or IE, from any machine (even machines that have never visited the sites), the change is not reflected on Site2. Here are URLs to the files to review: Site 1: http://www.landsofamerica.com/shared/scripts/common.js Site 2: http://www.landsoftexas.com/shared/scripts/common.js These files look exactly the same when logged onto the server, and the shared folder under landsoftexas.com is a symlink created using mklink to the shared folder under landsofamerica.com. Any idea what might be causing IIS to serve the wrong file?

  • Open and scroll through 42 GB text file in Mac OS X

    - by Django Johnson
    I am running Mac OS X 10.8.4 (Mountain Lion) and I am trying to open and scroll through a 42 GB .xml file. I plan to use an XML parser to parse it and delete parts, but first I need to know how the document is structured so I can know which parts to save. How can I open this text/XML file and scroll through it to get a glimpse of its structure? I tried my default text editor, TextMate, and it couldn't open the file. I tried gEdit and it shows the first 10 or so lines, but then quits while trying to load the rest. I would greatly appreciate any and all suggestions!
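    (Not part of the original post: command-line pagers and head/tail read a file lazily rather than loading it all into memory, so they can peek into a 42 GB file from Terminal. A minimal sketch, file name assumed:)
      # print only the first 200 lines to get a feel for the structure
      head -n 200 huge.xml
      # page through interactively; -S avoids wrapping very long lines
      less -S huge.xml
      # sample a chunk from deep inside the file (start ~20 GB in, show 50 lines)
      tail -c +20000000000 huge.xml | head -n 50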

  • IF commands in a batch file

    - by Rossaluss
    I'm writing a small batch file to replace users' themes and charts in Office, and the batch file below works just fine:
      cd c:\documents and settings\%username%\application data\microsoft\templates
      echo Y|rmdir charts /s
      mkdir charts
      echo Y|del "c:\documents and settings\%username%\application data\microsoft\templates\document themes\*.*"
      net use o: \\servername\sms
      copy "o:\ppt themes\charts\*.*" "c:\documents and settings\%username%\application data\microsoft\templates\charts"
      copy "o:\ppt themes\Document Themes\*.*" "c:\documents and settings\%username%\application data\microsoft\templates\document themes"
      c:
      net use o: /delete
    Now I want the above to run only if it hasn't run before, as we'll be pushing this out to all users for around two weeks to catch people who aren't in every day. Is there any way to begin the script with a check for one of the new themes/charts already pushed down, and have it not run if that file is present? Any help on this would be greatly appreciated, as I'm pretty new to batch files.

  • How to convert an image to a .dwg file

    - by erikric
    My girlfriend is making an art project in which an image will be printed and cut out of a metal plate. The firm responsible for doing this is demanding a .dwg file (and something called a polyline; some sort of setting, maybe?). Neither of us has heard of this file format, and I find the information about it quite confusing. Most pages seem to link to some sketchy "FooToBarConverter" software that, frankly, I don't trust. Could someone please enlighten us on what we need to do, or point us to some safe and preferably free software that could do this? (An explanation of the DWG format and the polyline thing would also be much appreciated.)

  • FreeNAS - how to "Exclude from file" in Rsyncd (GUI)

    - by user179181
    I am trying to set up rsync tasks to pull user profiles from 11 Windows machines running DeltaCopy Server, and then configure ZFS periodic snapshot tasks as a backup solution. So far this has been working fine, although I would like to exclude certain file types such as .DAT or NTUSER.DAT. My exclusion file resides on the local ZFS dataset (the receiving side) and is as follows:
      Temp
      Temporary Internet Files
      NTUSER.DAT
      NTUSER.DAT.LOG
      *.dat
      *.tmp
      *.DAT.log
      *.ost
      *.pst
    The line I typed under Auxiliary Parameters (Rsyncd Global Conf, under Services) is: exclude from = /mnt/Storage/User_Profiles/exclude.txt. I've tried deleting the .DAT files from the receiving end, and just as I start to get excited I click refresh and there they are again.
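    (Not part of the original post: for comparison, this is how the same exclude list is wired up on a plain client-side rsync run, where the option is --exclude-from; the rsync daemon instead reads the separate "exclude from =" parameter from rsyncd.conf. A minimal sketch with a made-up source path:)
      # --exclude-from reads one pattern per line from the given file
      rsync -av --exclude-from=/mnt/Storage/User_Profiles/exclude.txt \
          /path/to/profiles/ /mnt/Storage/User_Profiles/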
