Search Results

Search found 2420 results on 97 pages for 'dir 655'.

Page 60 of 97

  • How to perform a nested mount when using chroot?

    - by user55542
    Note that this question is prompted by the circumstances detailed by me (as Xl1NntniNH7F) in http://www.linuxquestions.org/questions/linux-desktop-74/boot-failure-upon-updating-e2fsprogs-in-ubuntu-10-10-a-947328/, so if you could address the underlying cause of the boot failure, I would very much appreciate it. I'm trying to replicate the environment of my Ubuntu installation (where the home folder is on a separate partition) in order to run make uninstall, and I'm working from a live CD. How do I mount a directory on one partition (sda2, mounted in Ubuntu as the home folder) into a directory on another mounted partition (sda3)? I ran chroot /mnt/sda2, but I don't know how to mount sda3 to /home, and my various attempts didn't work. As I am unfamiliar with chroot, my approach could be wrong, so it would be great if you could suggest what I should do given my circumstances.
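
    In case it helps, here is a minimal sketch of the usual approach from a live CD, assuming (going by the question's last sentence) that sda2 holds the root filesystem to chroot into and sda3 holds the home data; the mount point name is illustrative:

        # mount the root partition, then nest the home partition inside it before entering the chroot
        sudo mkdir -p /mnt/sda2
        sudo mount /dev/sda2 /mnt/sda2
        sudo mount /dev/sda3 /mnt/sda2/home
        sudo chroot /mnt/sda2 /bin/bash   # /home inside the chroot now shows the contents of sda3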

    Read the article

  • copy list of video file names to text file along with their runtime?

    - by Adam Johnston
    I am linking the question "Copy list of file names to text file?" above because of its relevance to my question. Is there anything similar that can be done to output a plain-looking text file (or XML or CSV file) with basically the same data that the following command prompt command produces: dir > c:\list.txt The only difference is that I also need the runtime of any and all video files included in the output file. Can this easily be done? Please let me know whether it can be done from the Python terminal as well, since I am familiar with that as well as Microsoft's DOS prompt. Thank you so very much.
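
    One possible route, sketched on the assumption that ffprobe (shipped with ffmpeg) is installed and on the PATH, is to loop over the video files at the command prompt and append each name and duration (in seconds) to the list; the folder and extensions below are made up:

        rem rebuild the list from scratch, then append each file name and its duration in seconds
        cd /d c:\videos
        del c:\list.txt 2>nul
        for %f in (*.mp4 *.avi *.wmv) do @(echo %f & ffprobe -v error -show_entries format=duration -of csv=p=0 "%f") >> c:\list.txt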

    Read the article

  • Someone tried to hack my site, and I want to understand the log

    - by garconcn
    I have a WordPress site hosted on CentOS 6. After seeing the following access log entry, I checked the server and it seems OK. Can anyone explain what this person is trying to do? Did they get what they wanted? I have disabled allow_url_include and restricted open_basedir to the web dir and tmp (/etc is not in the path). 190.26.208.130 - - [05/Sep/2012:21:24:42 -0700] "POST http://my_ip/?-d%20allow_url_include%3DOn+-d%20auto_prepend_file%3D../../../../../../../../../../../../etc/passwd%00%20-n/?-d%20allow_url_include%3DOn+-d%20auto_prepend_file%3D../../../../../../../../../../../../etc/passwd%00%20-n HTTP/1.1" 200 32656 "-" "Mozilla/5.0"

    Read the article

  • mac os x, find all symbolic links that point to files on a different volume

    - by Eddified
    In my ~ dir, I have some symlinks that point to "/Volumes/Macintosh HD 2/..." and I want to find them all recursively. A look at the man page for 'find' says the '-lname' argument will search the symbolic link contents. It appears to work, but not recursively: $ pwd /Users/myusername $ sudo find . -lname '/Volumes*' $ cd Documents/ $ sudo find . -lname '/Volumes*' ./Documents on Win7 ./work.rtf What's going on? How can I make this work recursively? -- The 'find' program is supposed to always work recursively. I checked perms, they look ok, but as you can see I used "sudo" just to be sure... no dice. $ ls -ld Documents/ drwx------+ 14 myusername staff 476 Jan 12 16:32 Documents/
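
    For what it's worth, a minimal sketch of the usual spelling, with -type l added so only symlinks can match and the glob quoted so the shell does not expand it before find sees it:

        # search the whole home directory for symlinks whose target lives under /Volumes
        find ~ -type l -lname '/Volumes/*' 2>/dev/null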

    Read the article

  • nginx clean url router/rewrites

    - by Janko
    I'm having difficulties with a relatively simple rewrite rule / router in my nginx config. All I want to do is: if the requested dir or file 'host/my/request/path[/[index.php]]' does not exist, rewrite to 'host/my/request/path.php'. The current rewrite works for: host host/ host/my/request/path But it won't work for: host/my/request/path/ Here is the rewrite part of the config: location = /(.*)/ { rewrite ^(.*)$ $1 permanent; } location / { try_files $uri/ $uri $uri.php; } The error log reports: Access forbidden by rule, request: "GET /my/request/path/ HTTP/1.0" Hm, is there a better way to solve this or to get rid of the trailing slash? edit, the rules spelled out more fully: host[/] > host/index.php host/index[/] > host/index.php host/my/path[/] > if /path/index.php exists: host/my/path/index.php else host/my/path.php
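
    As an aside, location = takes a literal URI rather than a regex, so the first block above can never match. A rough, untested sketch of one way to handle the trailing-slash case is to strip the slash with a redirect and let try_files fall through to the .php variant:

        location / {
            # redirect /my/request/path/  ->  /my/request/path
            rewrite ^/(.*)/$ /$1 permanent;
            # try the literal file, then a directory index, then the .php fallback
            try_files $uri $uri/index.php $uri.php =404;
        }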

    Read the article

  • How can I get command prompt to merge my files in name order?

    - by Anastasia
    I'm using the copy command in command prompt to merge all the files in a directory, for a number of directories. The problem is, I need to edit the first file in each directory before I merge. This means that when I put in the command "copy /b *.mp3 name.mp3", the joined file has part 2 at the start and part 1 at the end, presumably because it was created last. Is there a way of using the copy command so that the files merge in name order? Each folder has a different number of parts, anywhere from 2 to 1000 so I don't want to list each file with a "+" in between. Ideally, I'd like to find something to insert into the copy command I'm already using. Otherwise, is there a way of rearranging the files in a folder so that if you enter "DIR", part 1 shows up first even if it was edited last? I'm using Windows 7 by the way.
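
    One possible workaround, sketched as a small batch file (the folder, the output name and the assumption that the mp3 names contain no spaces are all made up): build the "+"-separated list in the order dir /b /on reports, then hand it to copy /b:

        @echo off
        setlocal enabledelayedexpansion
        cd /d c:\myfolder
        set "LIST="
        rem dir /b /on lists bare names sorted by name, regardless of when each part was edited
        for /f "delims=" %%f in ('dir /b /on *.mp3') do set "LIST=!LIST!+%%f"
        rem drop the leading "+" and do the binary merge
        copy /b %LIST:~1% c:\joined.mp3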

    Read the article

  • Why can't I delete this file on my PC?

    - by ooo
    I am trying to install some software and it requires removing the following file: cd %SYSTEMROOT%\assembly\GAC rename Microsoft.Office.Interop.Outlook Microsoft.Office.Interop.Outlook.Old The problem is that when I do the rename it looks successful, but it doesn't seem to actually remove Microsoft.Office.Interop.Outlook. I explicitly tried: del Microsoft.Office.Interop.Outlook and again it looks like it works, but when I do a "dir" I still see that file. Do you have any suggestions on why this won't let me delete the file? I have Outlook closed when I am doing this, so I wouldn't think Outlook is locking the file.

    Read the article

  • some issues with removing www and redirecting index.html

    - by MariaKeys
    Hello fellas, I am having trouble doing what I want to do with the following setup. I would like to remove all www, and also forward index.html to its root dir. I would like this to apply to all domains, so I am doing it inside an httpd.conf Directory directive. I have tried many variations with no success. The latest version is below (the domains are inside /var/www/html, in separate directories). http://www.example.com/index.html > http://example.com http://www.example.com/someother/index.html > http://example.com/someother/ Thanks, Maria <Directory "/var/www/html/*/"> RewriteEngine on RewriteBase / RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC] RewriteRule ^(.*)$ http://%1/$1 [R=301,L] #RewriteCond %{REQUEST_URI} /^index\.html/ RewriteRule ^(.*)index\.html$ / [R=301,L] Options ExecCGI Includes FollowSymLinks AllowOverride AuthConfig AllowOverride All Order allow,deny Allow from all </Directory>
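
    For reference, a rough, untested sketch of the second rule with the captured path kept, so .../index.html lands on its own directory instead of the site root (the rest of the block stays as above):

        # in this per-directory context $1 is the path below the domain's directory,
        # so /someother/index.html redirects to /someother/
        RewriteRule ^(.*)index\.html$ /$1 [R=301,L]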

    Read the article

  • How to know which revision of router I have? [migrated]

    - by Rosamunda
    I'm trying to update my D-Link DIR-600 router with the DD-WRT firmware. I've searched the site and found that revisions A1, B1 and B2 are supported, while C isn't. Now my router has this information on the back: P/N IIR600GNA .... C1G H/W Ver: C1 F/W Ver: 3.01 So I guess the H/W Ver is the revision, and it's C... so it's a lost cause? Or maybe because it's not just C but C1, I could do something with it? Thanks!

    Read the article

  • Ubuntu 12.10 loaded; after that the boot sector changed from Windows to GRUB

    - by Rupam Roy
    After installing Ubuntu 12.10 on my PC and giving it a path on the external HD, only its root directory went onto that drive, with all the other files staying on the PC's internal HD. I then needed the external HD every time I wanted to boot either Windows or Linux. I deleted the partition made by Linux from Windows Disk Management, and now I want to change the boot sector of my PC's HD back to Windows. The PC is not starting up and shows a GRUB failure. I have the original Windows 7 OS disc and tried booting from it to get to the command line, but what is the command that takes me to the DVD? I've tried 'cd dvd' and 'cd/ dvd'. Please help.
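
    In case it is useful, a sketch of the usual route from the Windows 7 DVD (boot it, choose Repair your computer, then Command Prompt); drive letters vary, and you switch to the DVD by typing its letter rather than cd:

        rem change drives with the letter itself, e.g. d: (not "cd dvd")
        d:
        rem rewrite the MBR boot code so GRUB is no longer loaded
        bootrec /fixmbr
        bootrec /fixboot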

    Read the article

  • Resources started with slash .htaccess redirection

    - by Pawka
    I have moved an old version of a webpage to a subdirectory: http://www.smth.com/old/. But all resources (images, CSS, etc.) in the HTML are linked with a slash at the start, so the browser still tries to load them from the root path. For example old/test.html contains: <img src="/images/lma_logo.ico" /> <!-- not working !--> <img src="images/lma_logo.ico" /> <!-- working !--> How can I rewrite URLs to load resources from the "old" dir if the URLs still start with "/"?
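
    One hedged possibility, assuming mod_rewrite is enabled and the rules live in the document root's .htaccess: send root-relative requests into /old/ whenever the file actually exists there.

        RewriteEngine On
        # leave requests that already point at /old/ alone
        RewriteCond %{REQUEST_URI} !^/old/
        # only rewrite when the asset really exists under /old/
        RewriteCond %{DOCUMENT_ROOT}/old%{REQUEST_URI} -f
        RewriteRule ^(.*)$ /old/$1 [L]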

    Read the article

  • What is this PHP process? It is crippling my server

    - by user1019588
    This process has been using 65% of my site's CPU and has lasted for about 10 minutes now (aren't processes only supposed to run for a couple of seconds?). It is obviously something to do with MySQL. That makes sense because I have a lot of queries going, but something still seems a bit odd... This could have something to do with the bad PDO connection that I mentioned in the previous question. Perhaps I am opening too many connections or something like that? Here are the stats on it: Owner: mysql Priority: 0 CPU %: 61.1 Memory %: 0.4 Command:/usr/sbin/mysqld --basedir=/usr --datadir=/var/lib/mysql --plugin-dir=/usr/lib64/mysql/plugin --user=mysql --log-error=/var/lib/mysql/cvps54834319.myhost.com.err --pid-file=/var/lib/mysql/cvps54834319.myhost.com.pid Thanks for any help on this. I have over 10GHz on my server, so this is very concerning to me.

    Read the article

  • Ideal permissions scheme for multiple Apache/PHP sites...

    - by Omega
    I'm hosting multiple sites from one server, where each site has its own user and www directory in its home dir. Currently our web server runs as user nobody(99). We're noticing that to run several popular scripts and engines, they require write access to their own files. As the home directory is owned by the user, not nobody(99), what is the best policy or change in hosting configuration that would: ...make it so that all the various engines and platforms work? ...still allow us to work with files and edit them without having to fiddle with permissions as root? Thanks for the advice!
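
    A sketch of one common pattern (user, group and path names are hypothetical; suEXEC or per-user PHP-FPM pools are the heavier alternatives): let the nobody user in through each site's group rather than loosening world permissions.

        usermod -a -G site1 nobody                          # add the web-server user to the site's group
        chgrp -R site1 /home/site1/www
        chmod -R g+rwX /home/site1/www                      # group read/write; execute only on dirs and executables
        find /home/site1/www -type d -exec chmod g+s {} \;  # new files inherit the site's group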

    Read the article

  • Cannot ping a VM from a Hyper-V host

    - by user1688175
    I am facing a weird situation in my network environment. My infrastructure looks like this: I have a D-Link DIR-635 acting as my default gateway (192.168.0.1), a physical Windows 2012 Server (192.168.0.10) with the following roles: DHCP, DNS, AD DS and Hyper-V, and a virtual Windows 2012 Server (192.168.0.50) which I intend to use as an IIS server (the role is not deployed yet). My virtual machine was able to get an IP address from the DHCP server and is working perfectly (I can ping the default gateway [by IP, FQDN or DNS alias], the Hyper-V host and any site on the Internet, CNN.com for example). However, I cannot ping the VM from my host. It says: Request Timed Out. Do you guys know what I might be doing wrong? Any support is appreciated! Thanks!
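
    One thing worth ruling out, since the symptom is host-to-VM only: the VM's Windows Firewall drops inbound ICMP echo by default on most profiles. A sketch of allowing it, run in an elevated prompt on the VM:

        netsh advfirewall firewall add rule name="Allow inbound ICMPv4 ping" protocol=icmpv4:8,any dir=in action=allow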

    Read the article

  • How can I use '{}' to redirect the output of a command run through find's -exec option?

    - by pkaeding
    I am trying to automate an svnadmin dump command for a backup script, and I want to do something like this: find /var/svn/* \( ! -name dir -prune \) -type d -exec svnadmin dump {} > {}.svn \; This seems to work, in that it looks through each svn repository in /var/svn and runs svnadmin dump on it. However, the second {} in the exec command doesn't get substituted for the name of the directory being processed; it basically just results in a single file named {}.svn. I suspect that this is because the shell interprets > as ending the find command, and it tries redirecting stdout from that command to the file named {}.svn. Any ideas?
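
    Exactly that: the shell consumes the > before find ever runs. A common workaround, sketched below, is to let a small subshell do the per-directory redirection:

        # the subshell receives each directory as $1, so the redirection is evaluated per repository
        find /var/svn/* \( ! -name dir -prune \) -type d \
          -exec sh -c 'svnadmin dump "$1" > "$1.svn"' _ {} \;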

    Read the article

  • Symlink to /Documents /Pictures etc. in OS X Home Directory?

    - by Larry O'Brien
    I have just purchased a 120GB SSD with the intent of making it my boot drive. I'd like to keep it as lean as possible since, y'know, it's so small (Heaven help me). I've read Can I move my home folder in Mac OS X? and Moving Mac OS X user folders? which discourage moving the entire home dir to a data drive. Is it possible and less-dangerous to leave the home directory on the boot drive but move the big data directories to a slower drive and symlink to them? I have the same thoughts with the /Applications directory, but maybe I should make that a separate question?
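
    It is possible; a minimal sketch for one folder (the volume name is made up), done while nothing has the folder open:

        mv ~/Movies "/Volumes/Data HD/Movies"      # move the bulky folder to the slower drive
        ln -s "/Volumes/Data HD/Movies" ~/Movies   # leave a symlink where it used to live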

    Read the article

  • `for` loop of Microsoft `cmd`: how can I process only the files with a certain extension?

    - by uvts_cvs
    I have the folder c:\test\ and two files in it, a.txt and b.txtv. I would like to process just the files with the extension .txt. If I write these commands cd c:\test for %f in (*.txt) do echo %f I get a result where both a.txt and b.txtv are listed. The same happens with cd c:\test dir *.txt It seems .txt is treated the same as .txtv. I have Windows XP SP3 in Italian, and the result of ver is Microsoft Windows XP [Versione 5.1.2600]. The same result comes from Windows 7 in English: Microsoft Windows XP [Version 6.1.7601].
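
    This is the 8.3 short-name behaviour: b.txtv also gets a generated short name whose extension is truncated to TXT, and wildcards are matched against both names. A sketch of a workaround that compares the real extension of each match:

        for %f in (*.txt) do @if /i "%~xf"==".txt" echo %f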

    Read the article

  • "private" directory not accessible in Apache

    - by janeden
    The directory private lives under my DocumentRoot, and despite its name, it should be accessible just like any other dir. But if I add the following RewriteRule to httpd.conf: RewriteRule ^/([^\.]+)$ /$1.html [L] Apache returns 403 for http://server/private/2201. The error log states client denied by server configuration: /private/2201.html If I then rename private to foo, or if I request 2201.html directly, the file is served: 127.0.0.1 - - [21/Nov/2011:10:24:45 +0100] "GET /private/2201 HTTP/1.1" 403 214 127.0.0.1 - - [21/Nov/2011:10:24:58 +0100] "GET /foo/2201 HTTP/1.1" 200 3068 127.0.0.1 - - [21/Nov/2011:10:27:39 +0100] "GET /private/2201.html HTTP/1.1" 200 3068 This is confusing. Is there any special rule for directories named private? If so – why does the direct request for 2201.html work (although the denied request seems to handle the same resource, at least according to the error log entry)?

    Read the article

  • Sharing a symlinked (`mklink /d`) directory via SMB?

    - by Alois Mahdal
    I have a Windows 7 amd64 box where one directory is shared: local path is d:\drop\ remote path is \\aloism\drop from SMB point of view, Everyone has Read and Write permission ACLs for the folder are set so that all authenticated users have read and write permissions:NT AUTHORITY\Authenticated Users:(OI)(CI)C (which is inherited to all levels below) Now I create a symbolic link within the structure of the directory: D:\drop>mklink /d tools2 tools symbolic link created for tools2 <<===>> tools The problem is that I can't access this new directory from any of the remote machines (a Windows 7 box and a Windows XP box—both behave the same way): C:\>dir \\aloism\drop\tools2\ Volume in drive \\aloism\drop is droot Volume Serial Number is FA73-1897 Directory of \\aloism\drop\tools2 File Not Found How can I make it work? Possibly also for files?
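
    One setting worth checking (a sketch, run from an elevated prompt on the client machine): remote symlink evaluation is disabled by default, and fsutil can report and change it.

        fsutil behavior query SymlinkEvaluation
        fsutil behavior set SymlinkEvaluation R2R:1 R2L:1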

    Read the article

  • Emacs: Prevent TRAMP connection closing

    - by Josh
    I'm using Emacs with TRAMP ( C-x C-f /ftp:[email protected]:/ ), and randomly, sometimes after ten minutes, sometimes after ~12 seconds (no exaggeration), my connection will close (I think). I'll try to type, or list a dir, and it will say "Opening FTP connection to site.com...". Is there a way to tell it to just keep the connection open until I exit? Or is it the web server killing the connection? I'm just using standard FTP. Thanks.

    Read the article

  • Is it possible for a router to "go bad" with time?

    - by JQAn
    I've been having problems with my internet connection over the past few weeks (intermittent disconnections, slow transfers, etc.), and my provider keeps telling me that the problem is not on their end. I have a cable modem with a wifi router (the router was not provided by them). The router is quite old (a DIR-300), so I'm starting to wonder if it could be the issue and whether I should replace it. Is it possible that it is the cause? Can routers become so outdated that they cause intermittent interruptions of service? If I reset the modem and the router, they work fine for a few hours, but the problems start again after a while.

    Read the article

  • SFTP sending files between laptops on Ubuntu

    - by twigg
    I want to transfer files between two Ubuntu systems using SFTP. I have got it set up and I can connect to the other laptop, ping it, and see its file list using sftp> dir, so I can see the files on the other system. But when I call get filename.deb it comes up saying Fetching /home/user/filename.deb to filename.deb 0% 0 0.0KB/s --:-- ETA and then drops back to the sftp command prompt without transferring anything. Have I missed something?

    Read the article

  • RUNIT - created first service directory, "sv start testrun" does not work

    - by Veseliq
    I'm pretty new to runit. I installed it on an Ubuntu host. What I did: 1) created a dir testrun in /etc/sv 2) created a script run in /etc/sv/testrun/run; the script content: #! /bin/bash exec /root/FP/annotate-output python /root/FP/test.py | logger -t svtest 3) If I call /etc/sv/testrun/run directly, it executes successfully 4) If I run sv start testrun (or sv run testrun, sv restart testrun), they all end up with the same error message: fail: sv: unable to change to service directory: file does not exist Any ideas what I am doing wrong? I'm new to runit and base all my actions on the information found here: http://smarden.org/runit/
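
    A sketch of the step that is usually missing, assuming a stock Ubuntu runit where runsvdir supervises /etc/service: sv resolves bare service names there, so the new service needs a symlink (or an absolute path, or SVDIR pointed at /etc/sv).

        ln -s /etc/sv/testrun /etc/service/   # register the service with runsvdir
        sv status testrun                     # or: sv status /etc/sv/testrun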

    Read the article

  • Persistent retrying resuming downloads with curl

    - by Svish
    I'm on a Mac and have a list of files I would like to download from an FTP server. The connection is a bit buggy, so I want it to retry and resume if the connection is dropped. I know I can do this with wget, but unfortunately Mac OS X doesn't come with wget. I could install it, but to do that (unless I have missed something) I would need to install Xcode and MacPorts first, which I would like to avoid. curl is available, though, but I don't know how it works or how to use it, really. If I have a list of files in a text file (one full path per line, like ftp://user:pass@server/dir/file1), how can I use curl to download all those files? And can I get curl to never give up? Like, retry infinitely and resume downloads where it left off, and such?
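
    A minimal sketch of one way to do it with curl alone (-O keeps the remote file name, -C - resumes a partial download, --retry keeps trying after transient failures); the list file name is made up:

        # read one URL per line and fetch it, resuming and retrying as needed
        while read -r url; do
          curl -O -C - --retry 999 --retry-delay 5 --retry-max-time 0 "$url"
        done < files.txt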

    Read the article

  • How to compare differences between directories (linux)

    - by Phil
    I have two directories: one from an earlier backup and a second from the newest backup. How do I compare what changes were made to the files in the directory from the newest backup on Linux? Also, how do I display the changes in, for example, text and PHP files? I'm thinking of something like the revision history on Wikipedia, where you see the old version on one side of the screen and the newest version on the other, with the changes highlighted. How do I achieve something like that? edit: How do I also compare a remote dir with a local one?
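
    A short sketch of the usual tools: diff -ru for two local trees (a graphical side-by-side view can come from meld or vimdiff on top of that), and a dry-run rsync for comparing against a remote copy; the paths are placeholders.

        diff -qr /backup/old /backup/new                   # just list which files differ
        diff -ru /backup/old /backup/new > changes.patch   # unified diff of every changed text file
        rsync -avn --itemize-changes /local/dir/ user@host:/remote/dir/   # -n: dry run, shows what would change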

    Read the article
