Search Results

Search found 81445 results on 3258 pages for 'file command'.


  • Files appearing empty or only partially transferred on FTP server

    - by james
    Firstly, apologies if this question has been asked and answered before, but I have had a look through related queries and found nothing identical to this. I have had a website for a few years and have never had any problem uploading files. But today, when I went to transfer a new HTML file onto the server, I did so and the file arrived. So I browsed to the file in my browser to check the page, as I always do, and the browser wouldn't acknowledge it. After repeated attempts to transfer it, it finally seemed to go over, but only at a quarter of the file size - 4 KB out of 16 KB - so only the top of the page was viewable in my browser. I've tried transferring with a number of FTP clients and no luck. My expertise on this is limited and I can't really think of the next step; the server isn't full, so I'm just stumped. Any ideas? Any and all feedback is greatly appreciated.
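
    A quick command-line check can rule the GUI clients in or out - a minimal sketch, assuming hypothetical credentials and paths; curl transfers the raw bytes and can report the size the server ends up with:

        # Hypothetical host, user and paths - substitute your own
        curl -T page.html ftp://user:password@ftp.example.com/public_html/
        # Ask the server how big the uploaded copy is...
        curl -sI ftp://user:password@ftp.example.com/public_html/page.html | grep -i content-length
        # ...and compare against the local size
        ls -l page.html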

    Read the article

  • BIND9 Forwarding by view

    - by Triztian
    Hi, I think this is a simple issue. I'd like to forward queries only for certain IPs in the LAN. For example, I have two ACL lists:

        acl "office1" {
            192.168.1.15;     // With internet access
        };
        acl "production" {
            192.168.1.101;    // No internet access
        };

    I know that there are probably more efficient ways to restrict internet access, but at the moment this is what I'd like to try. Here's what I've tried in named.conf.local:

        // Include my acl definitions
        include "/etc/bind/acls.conf";

        view "no-internet" {
            match-clients { production; };
            include "/etc/bind/named.conf.default-zones";
            zone "localdomain.com" {
                type master;
                file "/etc/bind/db.localdomain.com";
            };
            zone "1.168.192.in-addr.arpa" {
                type master;
                file "/etc/bind/db.192.168.1";
            };
        };

        view "internet" {
            match-clients { office1; };
            include "/etc/bind/named.conf.default-zones";
            forwarders {
                201.56.59.14;    // Made up
                201.56.59.15;    // Made up
            };
            zone "localdomain.com" {
                type master;
                file "/etc/bind/db.localdomain.com";
            };
            zone "1.168.192.in-addr.arpa" {
                type master;
                file "/etc/bind/db.192.168.1";
            };
        };

    As you can see, I want localdomain.com defined for every computer in my network, and internet queries forwarded for the computers in the office but not for the ones on the production floor. I've modified my conf file, however the IP in the "no-internet" ACL is still able to resolve domains, even though I've rebooted the computer, flushed the DNS using ipconfig /flushdns and set my DNS server as the only one. Why is this still happening? Thanks in advance.
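
    Not stated in the question, but a view with no forwarders can still resolve internet names by doing full recursion on its own; one hedged possibility is turning recursion off in the restricted view, sketched below along with a check from a client (the server address 192.168.1.1 is an assumption):

        // In the "no-internet" view only: answer local zones, do not recurse for anything else
        view "no-internet" {
            match-clients { production; };
            recursion no;
            // ... zones as above ...
        };

        # From a production host: the local zone should still answer, external names should not
        dig @192.168.1.1 localdomain.com
        dig @192.168.1.1 example.org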

    Read the article

  • Is it possible to change a user's home directory permissions in OS X?

    - by Sosiska
    Most of our staff use OS X as their main operating system. The problem is that recently we were attacked with some odd malware: users get a zip file via mail, and when they open this zip file they execute a keylogger binary that is inside it (one click is enough). We have some non-technical limitations, and due to these limitations we can't configure the users' mail servers. But we do have physical access to their laptops. As far as I know, it is possible to mount a user's home directory without the "x" (execute) permission in Linux and *BSD, so users can't run binary files inside their home directory. Is it possible to configure OS X so that users can't execute files inside /Users/?
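
    For reference, a minimal sketch of the Linux/*BSD approach the question refers to (device, filesystem and mount point are assumptions; whether OS X offers an equivalent is left open):

        # /etc/fstab entry mounting the home area without execute permission
        /dev/sdb1  /home  ext4  defaults,nosuid,noexec  0  2

        # Or apply it to an already-mounted filesystem
        mount -o remount,noexec /home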

    Read the article

  • Apache not serving pages stored in Subversion repository

    - by Stephen
    I've set up Apache and Subversion on an old PC, but Apache is not serving pages correctly. When I enter the address of my test site, http://HOME_IP_ADDRESS/test/index.html, I just get a File Not Found error and the following output in the error log:

        File does not exist: /var/www/html/svn/repos/test

    but I know the file exists; when I enter the URL http://HOME_IP_ADDRESS/repos/test/index.html into the browser, I just get a listing of the HTML. In my Apache config file I have the DocumentRoot set as follows:

        DocumentRoot "/var/www/html/svn/repos"

    so I'm not sure what is going on. I have SVN installed and I think it may have something to do with this. Edit: I changed the DocumentRoot location, which helped - pages in the new location were served correctly - so the problem is just with serving the pages from the repository.
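
    A Subversion repository isn't a plain directory tree, so pointing DocumentRoot at it generally won't serve the versioned files; the usual arrangement is to keep DocumentRoot elsewhere and expose the repository through mod_dav_svn. A hedged sketch with assumed paths:

        # httpd.conf - paths are assumptions; requires mod_dav and mod_dav_svn to be loaded
        DocumentRoot "/var/www/html"

        <Location /repos>
            DAV svn
            SVNPath /var/www/html/svn/repos
            # add authentication/authorization directives as needed
        </Location>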

    Read the article

  • What does this error mean (Can't create TCP/IP socket (24))?

    - by user105196
    I have a web server running RHEL 6.2, and MySQL 5.5.23 on another server. The web server can read from the MySQL server without problems, but sometimes I get this error:

        [Sun Sep 23 06:13:07 2012] [error] [client XXXXX] DBI connect('XXXX:192.168.1.2:3306','XXX',...) failed: Can't create TCP/IP socket (24) at /var/www/html/file.pm line 199.

    My question: what does this error mean (Can't create TCP/IP socket (24))? Is it an OS error or a MySQL error?

        perl -v
        This is perl, v5.10.1 (*) built for x86_64-linux-thread-multi

        mysql -V
        mysql  Ver 14.14 Distrib 5.5.23, for Linux (x86_64) using readline 5.1

        su - mysql -s /bin/bash -c 'ulimit -a'
        core file size          (blocks, -c) 0
        data seg size           (kbytes, -d) unlimited
        scheduling priority             (-e) 0
        file size               (blocks, -f) unlimited
        pending signals                 (-i) 127220
        max locked memory       (kbytes, -l) 64
        max memory size         (kbytes, -m) unlimited
        open files                      (-n) 1024
        pipe size            (512 bytes, -p) 8
        POSIX message queues     (bytes, -q) 819200
        real-time priority              (-r) 0
        stack size              (kbytes, -s) 10240
        cpu time               (seconds, -t) unlimited
        max user processes              (-u) 1024
        virtual memory          (kbytes, -v) unlimited
        file locks                      (-x) unlimited
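
    Not part of the question, but on Linux errno 24 is EMFILE ("Too many open files"), and the ulimit output above shows a 1024 open-file cap; a hedged sketch of confirming the message and raising the limit for the web server user (the apache user name is an assumption - the error is raised on the web server side, by DBI):

        # Translate errno 24 into its message
        perl -e '$! = 24; print "$!\n"'      # prints: Too many open files

        # How many descriptors do the web server processes hold? (assumed user name)
        lsof -u apache | wc -l

        # Raise the cap, e.g. via /etc/security/limits.conf, then restart the web server
        apache  soft  nofile  8192
        apache  hard  nofile  8192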

    Read the article

  • Measuring custom statistics with sar

    - by Will Glass
    I have a server application which I think is leaking file handles. I want to track the usage of file descriptors over time on my Linux (Ubuntu) server. I've figured out that I can track the number of file descriptors in use by a process with:

        lsof -p `pgrep the-process-name` | wc -l

    Since I'm already using sysstat and sar to track various metrics, I thought it'd be nice to display this with sar too. I want to measure it every 10 minutes. Is it possible to add a custom metric to sar? Then I could easily report it out. If not, I'll write a simple cron job to collect this data and store it separately in a log file.
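
    As far as I know, sar has no hook for user-defined counters, so the cron fallback mentioned above is usually the practical route; a hedged sketch (process name and log path are assumptions):

        # /etc/cron.d/fd-count - append a timestamped descriptor count every 10 minutes
        */10 * * * * root echo "$(date -Iseconds) $(lsof -p $(pgrep -d, the-process-name) | wc -l)" >> /var/log/fd-count.log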

    Read the article

  • psql editor setting on Ubuntu

    - by dezso
    The situation is the following. This is an Ubuntu box:

        Linux ns3mx3 2.6.32-41-server #89-Ubuntu SMP Fri Apr 27 22:33:31 UTC 2012 x86_64 GNU/Linux

    which means that when I first issue \e in psql, I'm asked to choose an editor. Then there is the .selected_editor file, which contains:

        # Generated by /usr/bin/select-editor
        SELECTED_EDITOR="/usr/bin/mcedit-debian"

    So far this is OK (it's my problem that I consider this completely useless, but never mind). Then I set up a .psqlrc file:

        \set PSQL_EDITOR /usr/bin/vim
        \set EDITOR /usr/bin/vim
        \set VISUAL /usr/bin/vim

    As you can see, I wanted to be sure not to miss a candidate variable for the editor setting. The file is used as expected:

        test=# \echo :EDITOR
        /usr/bin/vim

    But when I issue the \e command, none of these is used - I fall back to SELECTED_EDITOR. The situation remains just the same if I append \unset SELECTED_EDITOR to the .psqlrc file. Now how can I make the .psqlrc setting win over the default editor? (PostgreSQL version is 9.1.4.)
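
    Worth noting (not stated in the question): \set defines psql's own variables, whereas \e consults the PSQL_EDITOR, EDITOR and VISUAL environment variables; a hedged sketch of setting the environment instead, assuming a bash login shell:

        # ~/.bashrc or ~/.profile
        export PSQL_EDITOR=/usr/bin/vim

        # or just for one session
        PSQL_EDITOR=/usr/bin/vim psql test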

    Read the article

  • Edit write-protected files by breaking hard links

    - by Taymon
    A directory which I own and can write to contains hard links to files that I don't own and don't have write permission for. I want to open and edit these files in Emacs. When I save my changes, Emacs should rename the existing hard link by appending ~, then write my new version of the file as a new file owned by me. I was under the impression that Emacs could just do this (because of the way it does backups), but it's not working; when I save, it attempts to change the file's permissions in order to write to it (and fails because I don't own the file). How do I make this happen?
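
    For what it's worth, Emacs's choice between renaming and copying when it makes a backup is governed by a few variables; a hedged elisp sketch of the settings that bear on the behaviour described above (whether they resolve the permission error here is an assumption):

        ;; ~/.emacs.d/init.el
        ;; Prefer renaming the old file into the backup (which breaks the hard link) ...
        (setq backup-by-copying nil)
        ;; ... even when the file has multiple hard links or a different owner.
        (setq backup-by-copying-when-linked nil)
        (setq backup-by-copying-when-mismatch nil)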

    Read the article

  • df -h overreports disk space on VPS

    - by Rincewind42
    When I run the command df -h on my new Ubuntu Linux vServer I get the following:

        # df -h
        Filesystem            Size  Used Avail Use% Mounted on
        /dev/hdv1             466G   33G  434G   7% /
        none                   16M     0   16M   0% /tmp

    Running du -sh gives:

        # du -sh
        du: cannot access `./proc/13624/task/13624/fd/4': No such file or directory
        du: cannot access `./proc/13624/task/13624/fdinfo/4': No such file or directory
        du: cannot access `./proc/13624/fd/4': No such file or directory
        du: cannot access `./proc/13624/fdinfo/4': No such file or directory
        952M    .

    The VPS should only have 5 GB of disk space, but df reports 466 GB. How can I view the correct amount of disk space?

    Read the article

  • IIS_IUSRS cannot access files uploaded and created by Network Service - error 401.3

    - by Max
    Let me rephrase my question as I investigated further. The problem: I have a PHP script that is used to upload images on my Windows Server 2008 web server. The files are created in the correct directory, and they are created and owned by the user Network Service, which has full access to each uploaded file. But as soon as I try to access an uploaded file (usually an image) via HTTP, I get a 401.3 Not Authorized error. Now, if I right-click the inaccessible image and grant the IIS_IUSRS group read permission via the Security tab, the image can be accessed! By default IIS_IUSRS has no access at all to the uploaded file. The directory containing the image files has the correct access rights set, but each file newly uploaded to the directory lacks permissions for IIS_IUSRS. The question: how can I grant IIS_IUSRS access to newly uploaded files by default? The app pool of the website has its identity set to the default; I also tried setting it to "networkIdentity" or so, but that did not work either.
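
    Not from the question, but two things are often involved here: whether the upload folder's ACL is marked as inheritable by new files, and whether the script moves files in from a temp directory (a file moved within the same volume can keep the temp folder's ACL instead of inheriting the destination's). A hedged icacls sketch, with an assumed path:

        rem Grant IIS_IUSRS read/execute on the folder and have new files inherit it
        icacls "C:\inetpub\wwwroot\uploads" /grant "BUILTIN\IIS_IUSRS:(OI)(CI)(RX)"

        rem Re-apply the inherited permissions to files already in the folder
        icacls "C:\inetpub\wwwroot\uploads" /reset /T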

    Read the article

  • Dump Trac DB on Windows/XAMPP

    - by Whiteknight
    I have a Trac instance running on a Windows XP machine with XAMPP. I am trying to migrate the Trac instance to a newer Linux-based machine, but I'm having a hard time getting the database to cooperate. I try to dump the DB with this command:

        sqlite3 C:\tracroot\db\trac.db ".dump" >> mysqldump.sql

    But the generated file is mostly empty:

        BEGIN TRANSACTION;
        COMMIT;

    So that's not right. For the record, my Trac instance is running now and appears to have full access to all the contents of the DB, but sqlite3 (located in C:\xampp\apache\bin) can't seem to get any information from the file. The DB file itself has the header "SQLite format 3", so that seems to be correct. I need to know one of two things: how to get this dump working, OR an alternate way to migrate the Trac database to the new machine. Update: when I try to open the .db file in sqlite3, I get the error "Error: unsupported file format". What format is it in, and why is it unsupported?
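
    Two hedged things to check, with assumed paths: the "unsupported file format" message can come from an sqlite3 binary that is older than the features used by the database file, so the tool's version is worth confirming; and Trac ships its own consistent-copy mechanism, trac-admin hotcopy, which copies the whole environment (database included) for migration.

        rem Is the bundled sqlite3 old enough to explain the error?
        C:\xampp\apache\bin\sqlite3 -version

        rem Trac's own consistent copy of the environment (destination path is an assumption)
        trac-admin C:\tracroot hotcopy C:\tracroot-hotcopy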

    Read the article

  • FFmpeg add multiple audio files to video at specific points

    - by Arran
    I have two audio files, each about 3 minutes long. I want to take the first 10 seconds of each file and add them to a video file at specific points - 0 seconds and 10 seconds - so the resulting video should be 20 seconds long. I've got this far:

        ffmpeg -i video.mov -ss 0 -t 20 -itsoffset 0 -i audio1.mp3 -itsoffset 10 -i audio2.mp3 -acodec copy -vcodec copy out.mov

    ...but the resulting video has 20 seconds of the first audio file only; the second audio file doesn't start at 10 seconds like it should. Any help would be appreciated, thanks!
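
    A hedged alternative sketch using the filter graph instead of -itsoffset (same filenames as above; the exact chain is an untested assumption): trim each audio input to its first 10 seconds, reset the timestamps, concatenate the two pieces, and keep the first 20 seconds of video.

        ffmpeg -i video.mov -i audio1.mp3 -i audio2.mp3 \
          -filter_complex "[1:a]atrim=start=0:end=10,asetpts=PTS-STARTPTS[a1];[2:a]atrim=start=0:end=10,asetpts=PTS-STARTPTS[a2];[a1][a2]concat=n=2:v=0:a=1[aout]" \
          -map 0:v -map "[aout]" -c:v copy -t 20 out.mov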

    Read the article

  • How to view big files on Windows?

    - by user20988
    Sometimes we need to view large files - 30 MB to 100 MB. Usually we use the FAR viewer for this. Sometimes we need to copy long traces from such a file to the clipboard, but in the FAR viewer it is only possible to copy one screen at a time. What can be used for this purpose? It should be GUI and freeware. UPDATE: We need the ability to navigate through the file and to see updates to it in the meantime (e.g. like tail -f).

    Read the article

  • Need to install just libswresample.so.0 on CentOS 6 - how? (for FFmpeg)

    - by sprise
    I'm trying to get ffmpeg running on a CentOS 6 machine and it has been uphill the whole way. I thought I had it, but when I go to use ffmpeg I get the error:

        ffmpeg: error while loading shared libraries: libswresample.so.0: cannot open shared object file: No such file or directory

    I looked in /usr/local/lib, which is where all the libraries are stored, and I do not have that exact file, but I do have libswresample.a. I gave up on the official FFmpeg CentOS directions due to all kinds of issues and used yum to install. Where do I find the missing library, and can I just put the file in /usr/local/lib to fix it? Thanks - I have a basic Linux understanding, and am more familiar with Ubuntu than CentOS.
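
    One hedged check: the static libswresample.a archive cannot stand in for the shared object, but if the .so does exist somewhere on the machine, the loader may simply not know where to look. A sketch of finding it and registering /usr/local/lib with the dynamic linker (file names and paths are assumptions):

        # Is the shared object anywhere on disk?
        find / -name 'libswresample.so*' 2>/dev/null

        # If it lives in /usr/local/lib, tell the loader about that directory
        echo "/usr/local/lib" > /etc/ld.so.conf.d/local.conf
        ldconfig

        # Confirm ffmpeg's dependencies now resolve
        ldd $(which ffmpeg) | grep swresample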

    Read the article

  • How does Linux's unlink on an NTFS filesystem differ from Windows' own implementation?

    - by DavideRossi
    I have an external USB disk with an NTFS filesystem on it. If I remove a file from Windows and I run one of the several "undelete" utilities (say, TestDisk) I can easily recover the file (because "it's still there but it's marked as deleted"). If I remove the file from Linux (I'm using Ubuntu) no utility can recover the file (unless I use a deep-search signature-based one). Why? It looks like Linux does not just "mark it as deleted" but it wipes away some on-disk structure, is this the case?

    Read the article

  • AWStats - manual update error (permissions)

    - by Lewis
        Error: Couldn't open file "/var/www/awstats/awstats032014.site.net.tmp.9198" for write: Permission denied
        Setup ('/etc/awstats/awstats.site.net.conf' file, web server or permissions) may be wrong.
        Check config file, permissions and AWStats documentation (in 'docs' directory).

    I get this error when manually trying to update AWStats (via the browser link). I have set the folder permissions of /var/www/awstats/ to 775 and still get the error. If I create a new file in that folder, the default permission settings give it 774, which should work.
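
    Setting the directory to 775 only helps if the web server user owns it or is in its group, which the question doesn't confirm; a hedged sketch of checking and fixing ownership (the www-data user/group is an assumption - on Red Hat-style systems it is usually apache):

        # Which user does the web server run as, and who owns the data directory?
        ps aux | egrep 'apache|httpd|www-data' | head
        ls -ld /var/www/awstats

        # Give the web server's group write access (group name is an assumption)
        chgrp -R www-data /var/www/awstats
        chmod -R g+w /var/www/awstats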

    Read the article

  • Is NFS capable of preserving order of operations?

    - by JustJeff
    I have a diskless host 'A' that has a directory NFS-mounted on server 'B'. A process on A writes to two files, F1 and F2, in that directory, and a process on B monitors these files for changes. Assume that B polls for changes faster than A is expected to make them. Process A seeks to the head of the files, writes data, and flushes. Process B seeks to the head of the files and does reads. Are there any guarantees about how the order of the changes performed by A will be detected at B? Specifically, if A alternately writes to one file and then the other, is it reasonable to expect that B will notice alternating changes to F1 and F2? Or could B conceivably detect a series of changes on F1 and then a series on F2? I know there are a lot of assumptions embedded in the question. For instance, I am virtually certain that, even operating on just one file, if A performs 100 operations on the file, B may see a smaller number of changes that give the same result, due to NFS caching some of the actions on A before they are communicated to B. And of course there would be issues with concurrent file access even if NFS weren't involved and both the reading and the writing process were running on the same real file system. The reason I'm even putting the question up here is that it seems like most of the time the setup described above does detect the changes at B in the same order they are made at A, but occasionally some events come through in transposed order. So, is it worth trying to make this work? Is there some way to tune NFS to make it work, perhaps cache settings or something? Or is fine-grained behavior like this just too much to expect from NFS?
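
    On the tuning point raised at the end: the client-side caching that batches A's writes can be reduced with NFS mount options; a hedged sketch (server, export and mount point are assumptions, and the latency cost of these options is real):

        # On the writer (and/or the reader), mount with caching turned down
        mount -t nfs -o noac,sync serverB:/export/shared /mnt/shared

        # noac - disable attribute caching so changes become visible promptly
        # sync - make the client's writes synchronous rather than batched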

    Read the article

  • Access denied for user who has full access to some files in their own folder

    - by steve02a
    I have a very similar case to this user: Access denied on some files on Win2008R2 DC share. This is on Windows Server 2008 R2, and the user is on Windows 7 Pro. The user has her own home folder on the server. She can read/write/modify every file in it at will, except one; for that single file she gets "access denied". I can open it (as domain admin), and another user can open it (because she's in the domain admin group). I ran the AccessEnum tool and the read/write permissions are identical for all the files, so I can't explain why the user can't open this one single file. All her other files, in sub-folders and such, give no problems; this one file is causing a headache. What do you think could be wrong here?

    Read the article

  • Dialog in Linux

    - by user35319
    Hi everyone, I want to show the contents of a file in a dialog box, for which I have used the --textbox dialog and the --tailbox dialog, but they don't show the whole contents of the file - just some of the data, not all of it. I have searched a lot but found nothing, so if anyone has any idea please let me know, because I have been trying very hard to fix this problem.
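
    For reference, a hedged sketch of the two widgets mentioned above with explicit sizes (the file name is an assumption); --textbox shows a static file that can be scrolled with the arrow and PgUp/PgDn keys, while --tailbox follows the end of the file like tail -f:

        # Scrollable view of the whole file
        dialog --textbox /var/log/syslog 24 80

        # Follow a growing file
        dialog --tailbox /var/log/syslog 24 80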

    Read the article

  • Mercurial no longer working in NetBeans?

    - by John Isaacks
    I have been using Mercurial inside NetBeans for a while now. For the past few weeks Windows Explorer has been crashing for various reasons, including every time I right-click. Finally someone suggested I try uninstalling TortoiseSVN and TortoiseHG, since they affect Windows Explorer directly. I uninstalled both last week, since I don't ever use them (I either use the NetBeans interface or the command-line interface), and since then Windows Explorer has stopped crashing. Today I noticed that NetBeans is no longer showing me any of the Mercurial features. I pulled up a command prompt and it's not working there either. It seems that uninstalling TortoiseHG also uninstalled Mercurial altogether, which was not intended. I went to http://mercurial.selenic.com/wiki/Download to download the 64-bit 1.8.4 .exe version of Mercurial for Windows. I installed it, and I can now use the command line; however, it is still not working in NetBeans. Does NetBeans require Tortoise to work? Is there something else I am missing?

    Read the article

  • How do you stop Windows 7 from auto-streaming MP3s online?

    - by angryuser
    Be it IE, Firefox or Chrome, whenever I try to download a media file, Windows 7 starts streaming it in the browser instead of giving me options about what I want to do with the file I'm trying to download. I know the problem is with the OS and not the browser, because I can download the file just fine off the website when I use Ubuntu. I get the feeling that somewhere a setting is saying "open all MP3s in the browser", but I don't know where to find or change it. Can anyone help? Edit: If I click on the FLAC version of the audio file, Windows 7 automatically downloads it; if I click on the MP3 version, it automatically streams it in the browser.

    Read the article

  • With Ubuntu 12.04, unlike 11.04, Wine-installed applications' start menu links are missing

    - by Ron Whites
    With Ubuntu 12.04 and Wine 1.4, unlike Ubuntu 11.04 with Wine 1.2.2, installed applications' start menu links are missing. For instance, with Ubuntu 11.04 including Wine I can install one of our Windows applications and then go to Applications > Wine > Programs > Semantic Designs > TestCoverage > Documentation to bring up the documentation for how to run our tool. Unfortunately, with Ubuntu 12.04 the Applications menu is gone, and in the Dash I do see "Recent Apps" and "More Apps", but my installed Wine application and its documentation link are not shown, even though the Wine uninstaller shows them as present. I found an online suggestion and tried using the GNOME "Main Menu" editor: press the Windows key to launch the Dash, enter "Main Menu" in the search field and open the old Edit Main Menu app, select the category (aka Unity Dash filter) you want the item in, name the Dash/Launcher item, and add the command to launch said app. With Main Menu I then got down to the TestCoverage Documentation entry and could see a command link in its properties:

        env WINEPREFIX="/home/sdtest/.wine" wine C:\windows\command\start.exe /Unix /home/sdtest/.wine/dosdevices/c:/users/sdtest/Start\ Menu/Programs/Semantic\ Designs/Test\ Coverage/Java\ 1.7\ Documentation.lnk

    But I could not execute this link to view the installed documentation. So I copied the link's command into a file, set it as executable, and ran it as a bash script, and the documentation came up! So why can't I use this link under Main Menu?
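
    Not from the question, but in Unity the Dash indexes .desktop entries rather than the old menu tree; a hedged sketch of a hand-written launcher that simply runs the script which already works (the launcher and script file names are assumptions):

        # ~/.local/share/applications/sd-testcoverage-doc.desktop
        [Desktop Entry]
        Type=Application
        Name=SD Test Coverage Documentation
        Comment=Opens the Wine-installed documentation via the working shell script
        Exec=/home/sdtest/bin/open-testcoverage-doc.sh
        Terminal=false
        Categories=Wine;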

    Read the article

  • Iptables and system-config-firewall

    - by nivde92
    I had a set of netfilter rules set up with iptables, but someone told me to use system-config-firewall to add a rule for sharing files with Windows (Samba). This rewrote the iptables rules file and I lost my own custom rules. I have a backup copy, but am having trouble restoring them. Edit: The server is CentOS. I already tried to restore the rules with

        iptables-restore < /root/working.iptables.rules

    but for some reason the rules don't change. What are you trying to do? Trying to restore the iptables rules that I have in a backup file. What have you tried in order to make it happen? I've tried to modify the iptables file with vim, since the iptables-restore command was no help. What results did you expect? To get the old rules back. What actually happened? Nothing - when I run the command or edit the file by hand, the rules don't change at all. Maybe something else is overwriting them.
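
    A hedged CentOS 6 sketch (file names assumed): iptables-restore only changes the in-kernel rules, while the iptables init script re-applies /etc/sysconfig/iptables at boot and system-config-firewall rewrites that same file, so restoring and then saving may both be needed - and not re-running the firewall tool afterwards keeps it from rewriting things again:

        # Load the backed-up rules into the kernel and verify they took effect
        iptables-restore < /root/working.iptables.rules
        iptables -L -n -v

        # Persist them so the init script re-applies them on boot
        service iptables save      # writes /etc/sysconfig/iptables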

    Read the article

  • How to diagnose computer freezing problem

    - by reinierpost
    I have a laptop (a Medion from Aldi) that tends to hang quite often - so often, in fact, that several attempts to install Windows XP or Ubuntu on it have all failed. However, I am able to boot and run Ubuntu as found on the standard Ubuntu 10.10 installation image. I have done this twice so far. The first time, everything was running smoothly until at some point the GUI (i.e. X) became unresponsive. The cursor kept moving with the mouse, but menus would no longer show and clicking things no longer produced any response. So I switched to the consoles (Ctrl-F1, Ctrl-F2, etc.), which in this setup automatically run shells. The shells were still responsive, and the cd command would still work, but any command that invoked an executable (e.g. /bin/ls, or cd /bin; ./find) caused the shell to hang up uninterruptibly. My hypothesis was that all attempts at disk access were hanging, but I didn't actually try a command like echo /proc/$$ or while read line; do echo $line; done < /var/log/syslog to verify this. Another possibility is that an essential system library is cached in memory and somehow failing to function properly. The second time, I left the system running overnight and it didn't hang itself spontaneously. I'm not sure I have the patience to just twiddle with the running system until the condition reappears, and I'm not sure what to do once it does. Clearly we can rule out a software cause. It seems disk-access related, but clearly it's not a permanent hard disk failure because the system will reboot just fine. What kind of hardware problem might produce these symptoms? Can it be a memory problem?

    Read the article
