Search Results

Search found 17278 results on 692 pages for 'directory conventions'.

  • Regex that works in RedHat gives no results in Ubuntu

    - by Supratik
    My goal is to match specific files from specific subdirectories. I have the following folder structure:

        `-- data
            |-- a
            |-- a.txt
            |-- b
            |-- b.txt
            |-- c
            |-- c.txt
            |-- d
            |-- d.txt
            |-- e
            |-- e.txt
            |-- org-1
            |   |-- a.org
            |   |-- b.org
            |   |-- org.txt
            |   |-- user-0
            |   |   |-- a.txt
            |   |   |-- b.txt

    I am trying to list only the files directly inside the data directory. I get the correct result with the following command on RHEL:

        find ./testdir/ -iwholename "*/data/[!/].txt"
        a.txt
        b.txt
        c.txt
        d.txt
        e.txt

    If I run the same command on Ubuntu it produces no output. Can anyone please tell me why it is not working in Ubuntu?
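
    Without asserting why the two systems behave differently, here is a sketch of one alternative that expresses the same intent with more common find primitives, assuming (as the RHEL output above suggests) that the wanted files are regular .txt files sitting directly under data/:

        # list only the regular .txt files directly inside data/, case-insensitively
        find ./testdir/data -maxdepth 1 -type f -iname "*.txt"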

    Read the article

  • Best alternatives to recover lost directories on a FAT32 external hard drive?

    - by Sergio
    I have a 320 GB ADATA CH91 external hard drive. I suspect it has a problem with the connector of the USB jack; the point is that on certain occasions it fails during write operations, causing data loss. I just lost a directory with several GBs of very useful information, and I have not attempted to write to the disk since. What tool would you recommend to recover the lost data? The disk is FAT32-formatted (a single partition) and I use both Linux and Windows. Also, what filesystem format would you recommend to avoid future data loss? I currently use this external hard drive only from Linux, so there are several choices available (FAT, NTFS, ext3, ext4, reiser, etc.).
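
    Not from the excerpt, but one commonly suggested starting point is the TestDisk/PhotoRec pair, which scans the damaged device read-only. A minimal sketch, assuming a Debian-style system, with /dev/sdX as a placeholder for the actual device:

        sudo apt-get install testdisk
        # TestDisk can often re-list deleted FAT32 directory entries in place;
        # always write anything it recovers to a different disk
        sudo testdisk /dev/sdX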

    Read the article

  • Network Performance issue

    - by qubemarker
    We have three Ubuntu 10.04 servers. One is a storage server and the other two are configured as clients. The storage server has a good amount of capacity and is integrated with a Windows Active Directory server for authentication. I am uploading video files from both clients to the server. When I upload from any one client alone I get about 26 MB/s; when I upload from both clients simultaneously I only get about 8 MB/s from each. All of the servers have gigabit Ethernet cards, connected through an L2 managed gigabit switch. I don't know why the transfer rate drops so much under simultaneous reads and writes. I have tried all of the TCP-stack-related settings suggested here. Can anyone assist with getting better read/write performance out of this setup? Any help is appreciated.
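
    A hedged first diagnostic, assuming iperf is installed on all three machines: measure raw TCP throughput with both clients pushing at once, which separates network contention from disk contention on the server (server-ip is a placeholder):

        # on the storage server:
        iperf -s
        # on each client, started at the same time:
        iperf -c server-ip -t 30

    If the combined iperf rate also collapses, the bottleneck is in the network path; if iperf stays near line rate, look at the server's disk I/O instead.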

    Read the article

  • Can you remap "C:\Program Files" like you can with "My Documents"?

    - by Danny
    I'm not sure if this is possible, but I'm hoping you guys will know one way or the other! I'm going to reinstall Windows XP, and the primary master IDE disk is a smallish 10 GB drive. I'm pretty sure that if I tried to install all my programs back onto the C: drive they would not all fit. Is it possible to get my Program Files directory to point to a partition on one of my larger drives, so I don't end up with some of my programs on C: and others on D:, E:, etc.?
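
    For reference, a hedged sketch: on Windows XP the default location that installers offer is read from the ProgramFilesDir value under HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion. Repointing it is possible but widely considered fragile (already-installed software keeps its old paths), so treat this as an experiment rather than a supported remap:

        reg add "HKLM\SOFTWARE\Microsoft\Windows\CurrentVersion" /v ProgramFilesDir /t REG_SZ /d "D:\Program Files" /f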

    Read the article

  • Git completion __git_ps1 really slow on Mac

    - by mckeed
    I've had __git_ps1 in my bash prompt for a while, but just recently (I noticed it after I did some messing around with Homebrew and rbenv), it has slowed down my prompt horribly. When I'm in a git directory I have to wait 3-4 seconds after every command for the prompt to appear. If I just mash return and watch the Activity Monitor, it shows that distnoted and Finder are using more CPU than normal during the delay. Could something git-completion.bash is doing be triggering a notification to Finder? Maybe it involves folder actions or something?
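
    A hedged way to narrow this down, using standard git-prompt.sh knobs: time the prompt helper by itself, then turn off the optional status checks that make __git_ps1 spawn extra git processes on every prompt (the repository path is a placeholder):

        cd /path/to/some/repo
        time __git_ps1                 # measure the helper in isolation
        # standard git-prompt.sh options; unset any you have enabled
        unset GIT_PS1_SHOWDIRTYSTATE GIT_PS1_SHOWUNTRACKEDFILES GIT_PS1_SHOWUPSTREAM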

    Read the article

  • Installing software from source

    - by Learning
    I'm trying to understand the rationale behind installing software from source on Ubuntu 12.04. Obviously, I know I can download what I need from the repos, but I want to develop a deeper understanding of Linux. As a Windows user, when I download a program I double-click it and it installs into the Program Files directory unless I specify otherwise; when I want to uninstall it, Windows has a tool that does that for me. When I install a program from source on Linux, where does it install to? How do I uninstall it afterwards? Are there residual files left over? How would I tell whether it has been fully removed? For instance, I'm going to install LMMS (Linux Multimedia Studio) from source. I download and decompress the tarball and end up with a folder named lmms_XXX on my desktop containing an install file. If I run the install file from that location, does it install into that folder? If so, can I move that folder wherever I want? I was thinking about putting it in /opt/lmms.
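
    A minimal sketch of the common autotools flow, assuming the LMMS tarball of that era follows it (the version in the name is left as XXX deliberately). The --prefix switch is what decides where the files land, and uninstall support depends on the project's makefile:

        tar xf lmms_XXX.tar
        cd lmms_XXX
        ./configure --prefix=/opt/lmms   # install under /opt/lmms instead of the default /usr/local
        make
        sudo make install
        # if the project's makefile provides the target, this removes the installed files again:
        sudo make uninstall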

    Read the article

  • Gesture Based NetBeans Tip Infrastructure

    - by Geertjan
    All/most/many gestures you make in NetBeans IDE are recorded in an XML file in your user directory, "var/log/uigestures", which is what makes the Key Promoter I outlined yesterday possible. The idea behind it is to make analysis possible when you periodically pass the gesture data back to the NetBeans team. See http://statistics.netbeans.org for details. Since the gestures in the 'uigestures' file are identifiable by distinct loggers and other parameters, there's no end to the interesting things one can do with the data. While the NetBeans team can see which gestures are performed most frequently, e.g., which kinds of projects are created most often, thus helping to prioritize new features and bug fixes, you as the user can, depending on who takes the initiative and how, directly benefit from your collected data too. Tim Boudreau, in a recent article, mentioned the usefulness of hippie completion. So, imagine that whenever you use code completion, a tip were to appear reminding you about hippie completion. You would then be able to choose whether to see the tip again, i.e., customize the frequency of tips and the types of tips you would like to be shown. And then it could be taken a step further: the tip plugin could be set up so that anyone could register new tips per gesture. For example, maybe you have something very interesting to share about code completion in NetBeans. You would create your own plugin containing an HTML file with the text you would like displayed whenever you (or your team members, or your students, maybe?) use code completion. Then you would register that HTML file in the plugin's layer file, in a subfolder dedicated to the specific gesture you are interested in commenting on. The same is true, not just for NetBeans IDE, but for anyone creating applications on top of the NetBeans Platform, of course.

    Read the article

  • Why do some actions not work with Remote Desktop?

    - by Holgerwa
    I usually connect to other PCs in the same building using Remote Desktop, which works great. For some reason, though, certain actions cannot be performed through Remote Desktop, for example: installation of certain software; accessing the contents of a DVD inserted in the remote computer's drive; several other tasks that just "don't react or start" unless you do the same thing without RDP. All of these actions work with any other remote-access tool, like VNC, TeamViewer, LogMeIn, etc. My question is: what is different when I use a computer through RDP instead of directly? Is there a list of prohibited actions available, so one could know upfront whether something can be done over RDP?

    Read the article

  • Fully Qualified Domain name on Ubuntu Server

    - by Fazal
    I've set up a LAMP server on Ubuntu 10.04 (Lucid) and have also installed Virtualmin. This is my first attempt at setting up a server of any sort. I set up one virtual host using Virtualmin, and so far so good. Some odd things are happening, though: for instance, when I type my primary domain into a browser, I see the contents of the virtual server instead of what should be in the default directory. I'm going to use 123.456.789 and example.co.uk instead of my actual IP and domain name, if that's OK. I checked my hostname using hostname -f and got production1 as the response. The contents of my /etc/hosts file (IP and domain changed to something generic for this post):

        127.0.0.1 localhost localhost.localdomain
        123.456.789 production1.example.co.uk
        123.456.789 production1

    Shouldn't my FQDN be production1.example.co.uk? How can I go about changing this? A simple step-by-step instruction would be great! Thanks in advance.
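
    A minimal sketch of the usual fix, under the same placeholder IP and domain: hostname -f resolves the short hostname through /etc/hosts, and the first name listed after the matching IP is treated as the canonical, fully qualified one, so put the FQDN first on a single line:

        # /etc/hosts — FQDN first, short name second
        127.0.0.1   localhost localhost.localdomain
        123.456.789 production1.example.co.uk production1

        # then confirm:
        hostname        # production1
        hostname -f     # production1.example.co.uk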

    Read the article

  • How to synchronize a whole Ubuntu?

    - by Avio
    I think the time is ripe to have my whole Ubuntu synchronized just as my Dropbox folder is. Given that we are always talking about files and directories, what's the difference between my Documents folder and my /usr system directory? Almost none, except for their location. In fact, I think there is just one big issue that prevents people from having their beloved installations mirrored wherever they go: symlinks. Dropbox, Google Drive, Ubuntu One, SugarSync, SkyDrive: none of these services supports symlinking. This means that if I push a symlink into one of the synced folders, locally the symlink is kept as is, but remotely (in the cloud or on the other synced machines) the symlink is resolved to the actual file it originally pointed to. This completely disrupts Linux installations, so these services can't be used for this purpose. So the question is: does anybody know a way to achieve this? A whole Ubuntu, always synchronized with a remote running copy, but still locally stored on both disks? My best guess is that I could use NFS, but the main difference between Dropbox and NFS is that NFS is a remote filesystem that always forces remote access to the files, while Dropbox pushes modifications to local filesystems (and thus would perform better). I've also heard about NFS caching; does anybody know whether that could approximate Dropbox in this sense? P.S. I know that /boot, /dev, /proc, /run, /tmp and device-specific mountpoints in /mnt and /media would have to be left out of the sync mechanism. What I'm interested in is the principle: can this be done with reasonable performance, given reasonable resources (e.g. ~1 Mbps upload bandwidth and a public IP address)?
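
    Not something the excerpt settles, but a hedged sketch of the push-style mirroring it describes, using rsync (which, unlike the services listed, preserves symlinks as symlinks); the host and destination path are placeholders:

        # -a archive, -H hard links, -A ACLs, -X extended attributes
        rsync -aHAX --delete \
          --exclude=/boot --exclude=/dev --exclude=/proc --exclude=/run \
          --exclude=/tmp --exclude=/mnt --exclude=/media \
          / user@mirror.example.com:/backups/ubuntu-root/

    Run from cron or an inotify trigger this approximates one-way sync; true two-way sync of a live root filesystem is a much harder problem.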

    Read the article

  • How to prevent nginx from appending the location to root? [duplicate]

    - by simonszu
    I want to serve an Icinga web view via nginx. This web view should be accessible via myserver.com/icinga (as the Debian auto-config for Apache does it). I have the following lines in my nginx config:

        location /icinga {
            root /usr/share/icinga/htdocs;
            index index.html;
            auth_basic "Restricted";
            auth_basic_user_file /etc/icinga/htpasswd.users;
        }

    However, I get a 404 error and a log entry that says:

        *10 open() "/usr/share/icinga/htdocs/icinga" failed (2: No such file or directory)

    So it seems that nginx appends the location value to the root value. I think I figured out how to prevent this some time ago, but I did not document it for myself and have forgotten how. Can you tell me how to prevent this behaviour?
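
    For reference, a sketch of the standard fix: with root, the full request URI is appended to the path, so /icinga becomes /usr/share/icinga/htdocs/icinga. The alias directive substitutes the matched location prefix instead (same paths as in the excerpt):

        location /icinga/ {
            alias /usr/share/icinga/htdocs/;
            index index.html;
            auth_basic "Restricted";
            auth_basic_user_file /etc/icinga/htpasswd.users;
        }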

    Read the article

  • Facing difficulty migrating from WordPress to Drupal

    - by rakibtg
    One of my blogs was built with WordPress, but now I want to use Drupal as its CMS. To do so I deleted all the WordPress files from my server, along with the database and the MySQL user associated with the WordPress blog, and uploaded the Drupal files to the server directory where the WordPress files were. But when I open the blog, it still shows the WordPress site, even though it has been deleted and the Drupal installation interface should appear instead. I have re-checked my server directories and database: there are no WordPress files and the WP database is gone; only the Drupal files remain. Yet when I go to the blog to install Drupal, the WordPress blog still appears. I have checked the blog in several web browsers, so it is not a browser cache problem. My hosting server is Linux-based. I can't understand what to do. Any idea? Thanks

    Read the article

  • Can I install ConsoleZ without a package manager?

    - by TheGrapeBeyond
    I am not sure why/how, but I can't seem to simply install ConsoleZ on my Windows 7 computer. I went here, got the latest x64 release, and unzipped it. After unzipping it, I get just one directory. Now I simply double-click Console.exe. This, however, gives me a very plain-looking console that actually says 'Console2' at the top, not ConsoleZ. This is the first point that confuses me: what is going on here? The other .exe (ConsoleWow) does nothing when I click it. So I Googled around some more and found that I can get ConsoleZ from a package manager called 'Chocolatey'. I have not tried that yet (should I have to?), but it is another possible route. I still do not understand how/why my first attempt above does not work. Where is 'ConsoleZ'?
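
    If the Chocolatey route is tried, the usual shape of the command is below; the package id consolez is an assumption here, not something the excerpt confirms:

        choco install consolez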

    Read the article

  • Data inaccessible from WHS drive

    - by Eakraly
    Hi all, I had a Windows Home Server machine that crashed. Before doing anything else I decided to get the data off it, so I took the hard drive and connected it to a Windows 7 PC. What I get is that I cannot access almost any file! I can see the directory structure and open small files like .txt and .ini, but bigger files like .iso and video files do not open. The same goes for Ubuntu and OS X: I can see the files and even copy them, but the copies are corrupted. Any ideas what the problem is?

    Read the article

  • How do I install Visual Studio 2010 Express somewhere besides C:?

    - by TwentyMiles
    I have an SSD as my primary (C:) drive, mainly used for quickly loading games. It's pretty small (~30 GB), so I want to keep things that don't really need the speed boost off of it. I attempted to install the Visual Studio 2010 Express beta last night; it claimed to require 2.1 GB of space, so I changed the install directory to a secondary, non-SSD drive. After this, the installer said it would use 1.8 GB on C: and ~200 MB on the secondary drive. While this token gesture of moving a tenth of the app to the place I told it to is cute, I really want to install everything I can to the secondary drive. Is there any way to install all of Visual Studio 2010 Express to a drive other than C:?

    Read the article

  • Problem Writing to Samba Share

    - by Chris
    Hello, I have had a problem writing to a Samba share, and I believe you have the answer in a post you made in October. Can you tell me how to do this? Thank you very much. "On the Samba server, you need to ensure that the nobody user has write permissions to /Windows_Backups/DC. You're forcing everyone to be impersonated by the nobody account, so that account will need file-level permissions on that share directory. Samba will respect local permissions when figuring out who can write where; in this case it is somewhat like Windows."
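
    A hedged sketch of what the quoted advice amounts to, using the share path from the quote (the group name varies by distribution; nogroup is the Debian/Ubuntu convention):

        # give the nobody account ownership, and hence write access, on the share
        sudo chown -R nobody:nogroup /Windows_Backups/DC
        # u+rwX: read/write for the owner, execute (traverse) only on directories
        sudo chmod -R u+rwX /Windows_Backups/DC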

    Read the article

  • OS X - Automatically Set Execute Permissions for New Files?

    - by i help X u
    I'm using OS X 10.6.4 and am trying to set a folder to automatically enable execute permissions on new script files copied into or created in a directory. I have used Sandbox 2 to set every permission for the folder to enabled, with sticky bits and the inherit flag set, but I still have to manually set the execute flag using chmod for every new file. I've done:

        chmod -R a+rwxs ~/scripts

    and:

        chmod 7777 ~/scripts

    and the permissions for the folder show as drwsrwsrwt+. But if I add a new script file, it is created as -rw-r--r--+ (the default). I looked at setting umask 000 in the .profile file, but new files are created from a base mode of 666, so no umask can ever grant execute permission; files would need a base of 777, so that's not relevant. I have figured out how to automate this with chmod in an AppleScript triggered by a folder action, but I'm wondering if there is a simple ACL or chmod setting I'm missing. So, is there a way to automatically set execute permission for new files, without using a folder action and AppleScript?
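
    One avenue in the same spirit, sketched with OS X's ACL support in chmod (the +a flag and inheritance permissions are real; whether every application creating files in the folder honors the inherited entry is the assumption to test):

        # grant execute on current contents and mark the entry inheritable,
        # so files created inside ~/scripts afterwards pick it up
        chmod +a "group:everyone allow read,execute,file_inherit" ~/scripts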

    Read the article

  • Working around broken sudo?

    - by perreal
    I managed to break sudo by deleting the libc.so.6 symlink in /lib. I copied the actual library file and created a symbolic link to it with the same name under my home directory, and I am getting by with LD_PRELOAD=/lib/libc-2.11.3.so. At this point, all binaries linking libc work through the preload except sudo. For sudo, I have to run (and I don't know why):

        /lib/ld-linux-x86-64.so.2 --library-path . /usr/bin/sudo

    but this gives me:

        sudo: must be setuid root

    Checking the permissions:

        $ ls -l /usr/bin/sudo
        -rwsr-xr-x 2 root root 166120

    So the setuid bit is actually set. Question: I need to create a symbolic link named /lib/libc.so.6 through my active SSH connection without using sudo, or make sudo work somehow. I don't have the root password and I can't connect through SSH any more. Is there any other way I can get authorization?

    Read the article

  • Linux virtual disk striping or multi-path Samba share?

    - by wachpwnski
    I am trying to build a file-storage box for media. It needs to present two or more directories or partitions as one share. There are a few candidate solutions, but also reasons I want to avoid them, among these: Using LVM2 for striping: I don't really have the resources to back up everything on the volumes in case one HDD goes south, so I would end up losing everything; maybe there is a better option that prevents data loss, with hot-swappable drives or some kind of RAID. Using symbolic links in the share: this gets tedious every time a new subdirectory is added. Is there some kind of software RAID I can use to merge two directories virtually? I am aware of the issue where /dev/hda1/media/file.1 and /dev/hdb1/media/file.1 both exist, but I'm sure there are creative solutions for that.
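
    One hedged sketch of a union-mount approach from this era, mhddfs (a FUSE filesystem; the mount points are placeholders). It presents several branches as one tree, and on duplicate paths the earlier branch in the list wins rather than raising an error:

        sudo apt-get install mhddfs
        # merge two mounted drives into a single shareable view
        sudo mhddfs /mnt/disk1,/mnt/disk2 /srv/media -o allow_other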

    Read the article

  • How to enable winhlp32 on Windows 7 64-bit?

    - by BGM
    Salvete! I just discovered that winhlp32.exe won't run on Windows 7 64-bit: I can't run the application, and I can't open .hlp files either (.chm files open fine). I have downloaded the Microsoft fix here and restarted my computer, but to no avail. I can see the file winhlp32.exe in my C:\Windows directory but cannot run it. When I do run it, I get Windows' own "Help and Support" entitled "Why can't I get Help from this program?", which sends me to the link above! How can I make this work?

    Read the article

  • How to migrate Notepad++ settings?

    - by NoCatharsis
    I am trying to make every program I use portable if possible, and Notepad++ is on the list. The only problem is that I've had a native installation until now, so I'm not totally sure which settings files need to be moved to the portable directory. Surely there's a function tucked away somewhere in NPP for exactly this purpose, or some plugin out there? The developers have literally thought of everything else, yet this is the one thing I cannot find anywhere in the NPP wiki or otherwise, and I don't want to miss an important file. The closest I've gotten is "Notepad++'s configuration files" and "Where are all the files?". Should I just copy every configuration file listed in the first link?
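
    A hedged sketch, assuming the standard locations: an installed Notepad++ keeps per-user settings under %APPDATA%\Notepad++, and a portable copy reads settings from its own directory when a doLocalConf.xml file is present there. The destination path is a placeholder:

        xcopy "%APPDATA%\Notepad++" "D:\Portable\Notepad++\" /E /I
        rem an empty doLocalConf.xml makes NPP use the local, portable settings
        type nul > "D:\Portable\Notepad++\doLocalConf.xml"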

    Read the article

  • Debian doesn't boot after removing secondary hard drive

    - by Daveel
    In the beginning I had Debian 6 running on one hard drive (/dev/sda1). Then I decided to keep all my stuff (pics, videos, etc.) on a second, slave hard drive (/dev/sdb1). So sda1 has the Debian OS, and sdb1 doesn't contain any OS files. I made it mount automatically by adding a row to /etc/fstab (UUID and the directory to mount to). Time passed, and when I tried to replace that secondary hard drive with one of bigger capacity, for some reason Debian would not boot with the secondary drive (sdb1) removed, even though the OS lives entirely on sda1. If I plug sdb1 back in, it boots just fine. I tried commenting the line out of /etc/fstab so it doesn't mount, and also ran update-grub after umount /dev/sdb1. What's the right way to remove the secondary hard drive?
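
    One hedged guess at the mechanism, with a sketch: if any still-active fstab entry references the removed drive's UUID, the boot can stall waiting for the missing device. The standard nofail mount option tells the system to continue booting when the device is absent (the UUID and mount point below are placeholders):

        # /etc/fstab — allow boot to proceed even if this disk is unplugged
        UUID=xxxx-xxxx  /media/data  ext3  defaults,nofail  0  2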

    Read the article

  • Avoid Windows Explorer loading complete executable files

    - by user13001
    On Windows Vista, when browsing to a network folder containing executables, Windows Explorer seems to load each file completely just to be able to show the executable's icon (Resource Monitor indicates heavy traffic while the directory loads). On XP, only a part of each file is loaded. Is there a way to avoid the complete loading of these files? Note that disabling my antivirus does not help. Update: this only happens for executables linked with /SWAPRUN:NET. Microsoft confirmed this as a bug in Vista, but they seem not very eager to fix it.

    Read the article

  • 12.04 LTS Apache2: writing files from a webpage at runtime has no effect - possible read/write permissions?

    - by J Green
    I'm running 12.04 LTS with Apache and Mono in VirtualBox, with the goal of hosting a web app (coded in ASP.NET and C#) on my local network. The scripts on the page are able to read from text files in the same directory as my site (/var/www/mysite/) but do not seem to be able to write. I'm sure the code works, because it did in my testing with Visual Web Developer on Windows. I don't get any errors, but when I click the button on the loaded web page, the text file in question does not change. I'm fairly new to Linux in general, so I'm not too familiar with how to set permissions properly, and this may be a permissions issue. Unfortunately, I have searched all over the internet and haven't found a solution that worked, but I've tried (perhaps incorrectly) changing the owner of the files in question to www-root and changing the mode to a+rw, sadly to no avail. I have also tried everything in "What's the simplest way to edit and add files to /var/www?", but it doesn't work. I hope someone can help me out.
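
    A hedged sketch of the usual fix on a default Ubuntu setup, where Apache (and mod_mono under it) runs as the www-data user rather than www-root; that account needs write access to whatever files the page modifies:

        # confirm which account apache runs as (typically www-data on Ubuntu)
        ps aux | grep apache2
        # hand the site over to that account and give the owner write access
        sudo chown -R www-data:www-data /var/www/mysite
        sudo chmod -R u+rwX /var/www/mysite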

    Read the article
