Search Results

Search found 14231 results on 570 pages for 'folder redirection'.

Page 411/570 | < Previous Page | 407 408 409 410 411 412 413 414 415 416 417 418  | Next Page >

  • Mac OS X desktop background gets reset to "Andromeda Galaxy.jpg" when users log out

    - by smokris
    I'm running Mac OS X 10.6 Server and have a set of 10.6 and 10.7 workstations connected to it for authentication and MCX. Users have network profiles (home folders stored on the server, via AFP). When users are logged in, they can change their desktop background. So far so good. However, the next time they log in, their desktop background has been reset to "Andromeda Galaxy.jpg". Though MCX is enabled and used to control other settings, MCX is disabled for the Desktop. What is keeping the users' desktop background from being preserved? How can I fix it?

    Read the article

  • Unable to list contents/remove directory (linux ext3)

    - by RedKrieg
    System is CentOS 5 x86_64, completely up to date. I've got a folder that can't be listed (ls just hangs, eating memory until it is killed). The directory size is nearly 500k:

      root@server [/home/user/public_html/domain.com/wp-content/uploads/2010/03]# stat .
        File: `.'
        Size: 458752       Blocks: 904        IO Block: 4096   directory
      Device: 812h/2066d   Inode: 44499071    Links: 2
      Access: (0755/drwxr-xr-x)  Uid: ( 3292/ user)   Gid: ( 3287/ user)
      Access: 2012-06-29 17:31:47.000000000 -0400
      Modify: 2012-10-23 14:41:58.000000000 -0400
      Change: 2012-10-23 14:41:58.000000000 -0400

    I can see the file names if I use ls -1f, but it just repeats the same 48 files ad infinitum, all of which have non-ASCII characters somewhere in the file name:

      La-critic\363-al-servicio-la-privacidad-300x160.jpg

    When I try to access the files (say to copy them or remove them) I get messages like the following:

      lstat("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Sebast\355an-Pi\361era-el-balc\363n-150x120.jpg", 0x7fff364c52c0) = -1 ENOENT (No such file or directory)

    I tried altering the code found on this man page and modified the code to call unlink for each file. I get the same ENOENT error from the unlink call:

      unlink("/home/user/public_html/domain.com/wp-content/uploads/2010/03/Marca-naci\363n-Madrid-150x120.jpg") = -1 ENOENT (No such file or directory)

    I also straced a "touch", grabbed the syscalls it makes and replicated them, then tried to unlink the resulting file by name. This works fine, but the folder still contains an entry by the same name after the operation completes, and the program runs for an arbitrarily long time (strace output ended up at 20GB after 5 minutes and I stopped the process).

    I'm stumped on this one. I'd really prefer not to have to take this production machine (hundreds of customers) offline to fsck the filesystem, but I'm leaning toward that being the only option at this point. If anyone's had success using other methods for removing files (by inode number; I can get those with the getdents code) I'd love to hear them. (Yes, I've tried find . -inum <inode> -exec rm -fv {} \; and it still has the problem with unlink returning ENOENT.)

    For those interested, here's the diff between that man page's code and mine. I didn't bother with error checking on mallocs, etc. because I'm lazy and this is a one-off:

      root@server [~]# diff -u listdir-orig.c listdir.c
      --- listdir-orig.c      2012-10-23 15:10:02.000000000 -0400
      +++ listdir.c   2012-10-23 14:59:47.000000000 -0400
      @@ -6,6 +6,7 @@
       #include <stdlib.h>
       #include <sys/stat.h>
       #include <sys/syscall.h>
      +#include <string.h>

       #define handle_error(msg) \
              do { perror(msg); exit(EXIT_FAILURE); } while (0)
      @@ -17,7 +18,7 @@
           char d_name[];
       };

      -#define BUF_SIZE 1024
      +#define BUF_SIZE 1024*1024*5

       int main(int argc, char *argv[])
       {
      @@ -26,11 +27,16 @@
           struct linux_dirent *d;
           int bpos;
           char d_type;
      +    int deleted;
      +    int file_descriptor;

           fd = open(argc > 1 ? argv[1] : ".", O_RDONLY | O_DIRECTORY);
           if (fd == -1)
               handle_error("open");

      +    char* full_path;
      +    char* fd_path;
      +
           for ( ; ; ) {
               nread = syscall(SYS_getdents, fd, buf, BUF_SIZE);
               if (nread == -1)
      @@ -55,7 +61,24 @@
               printf("%4d %10lld %s\n", d->d_reclen,
                      (long long) d->d_off, (char *) d->d_name);
               bpos += d->d_reclen;
      +        if ( d_type == DT_REG )
      +        {
      +            full_path = malloc(strlen((char *) d->d_name) + strlen(argv[1]) + 2); //One for the /, one for the \0
      +            strcpy(full_path, argv[1]);
      +            strcat(full_path, (char *) d->d_name);
      +
      +            //We're going to try to "touch" the file.
      +            //file_descriptor = open(full_path, O_WRONLY|O_CREAT|O_NOCTTY|O_NONBLOCK, 0666);
      +            //fd_path = malloc(32); //Lazy, only really needs 16
      +            //sprintf(fd_path, "/proc/self/fd/%d", file_descriptor);
      +            //utimes(fd_path, NULL);
      +            //close(file_descriptor);
      +            deleted = unlink(full_path);
      +            if ( deleted == -1 ) printf("Error unlinking file\n");
      +            break; //Break on first try
      +        }
           }
      +    break; //Break on first try
       }
       exit(EXIT_SUCCESS);
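
    One variation that may be worth trying before resorting to fsck: since the directory is already open for the getdents calls, the entries can be removed relative to that file descriptor with unlinkat(2), handing the raw d_name bytes straight back to the kernel and skipping path reconstruction entirely. A minimal sketch (not the poster's code; if the directory entries really are corrupt it may still return ENOENT, but it rules out any path-joining or encoding problem, and it caps the number of entries in case the listing loops):

      /* unlink-raw.c -- sketch only.
       * Removes regular entries of the directory given as argv[1] with
       * unlinkat(), relative to the open directory fd, so the raw d_name
       * bytes go back to the kernel untouched (no path joining, no locale). */
      #define _GNU_SOURCE
      #include <dirent.h>
      #include <errno.h>
      #include <stdio.h>
      #include <stdlib.h>
      #include <string.h>
      #include <unistd.h>

      int main(int argc, char *argv[])
      {
          DIR *dir;
          struct dirent *ent;
          int dfd;
          long seen = 0;

          if (argc < 2) {
              fprintf(stderr, "usage: %s <directory>\n", argv[0]);
              exit(EXIT_FAILURE);
          }

          dir = opendir(argv[1]);        /* readdir() uses getdents underneath */
          if (dir == NULL) {
              perror("opendir");
              exit(EXIT_FAILURE);
          }

          dfd = dirfd(dir);              /* directory fd for the *at() call */
          while ((ent = readdir(dir)) != NULL) {
              if (strcmp(ent->d_name, ".") == 0 || strcmp(ent->d_name, "..") == 0)
                  continue;
              if (++seen > 1000000) {    /* the listing was seen to loop, so cap it */
                  fprintf(stderr, "stopping after %ld entries\n", seen);
                  break;
              }
              if (unlinkat(dfd, ent->d_name, 0) == -1)
                  fprintf(stderr, "unlinkat(%s): %s\n", ent->d_name, strerror(errno));
          }

          closedir(dir);
          exit(EXIT_SUCCESS);
      }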

    Read the article

  • not able to upload files into mediawiki -- weird one

    - by Michael
    This is completely frustrating me. When I try to upload a small JPEG file I get the following error:

      Warning: wfMkdirParents: failed to mkdir "/usr/local/mediawiki-1.20.5/images/5/5d" mode 0777 in /usr/local/mediawiki-1.20.5/includes/GlobalFunctions.php on line 2546

    Environment: CentOS 6.4, MediaWiki 1.20.5, PHP 5.5.0RC1 (apache2handler), MySQL 5.5.31

    php.ini:
      safe_mode = off
      file_uploads = On
      max_file_uploads = 20

    LocalSettings.php:
      $wgEnableUploads = true;
      $wgUseImageMagick = true;
      $wgImageMagickConvertCommand = "/usr/bin/convert";

    images folder:
      chown apache:apache images/
      chmod 755 -R images/  (threw error)
      chmod 777 -R images/  (threw error)

    I've restarted Apache and still cannot upload. I'm stumped. Any ideas?
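
    A quick check that often explains mkdir failures like this on CentOS 6 even when the directory is world-writable is SELinux. A rough diagnostic sketch, assuming the default targeted policy and the paths from the error message (if SELinux turns out not to be enforcing, this doesn't apply):

      # Is SELinux enforcing, and what label does the images tree carry?
      getenforce
      ls -Zd /usr/local/mediawiki-1.20.5/images

      # Any recent denials logged against httpd?
      grep denied /var/log/audit/audit.log | grep httpd | tail

      # If denials show up, relabel the upload tree so Apache may write to it
      chcon -R -t httpd_sys_rw_content_t /usr/local/mediawiki-1.20.5/images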

    Read the article

  • How do I now access my site for an installation

    - by user4524
    I have just rented a virtual private server with DirectAdmin. I have an IP address; let's say it's 178.239.60.18. Now I have made a new domain on the server. It resides in a folder called example. When I try to access this in a browser, I type in 178.239.60.18/example or 178.239.60.18:example, but this does not work. What am I doing wrong? When I look at the DNS record it does say that the IP address for example is 178.239.60.18.
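
    For what it's worth, DirectAdmin serves each domain as a name-based virtual host, so the server decides which site to show from the hostname in the request, not from a path or port tacked onto the IP. Until the domain's DNS actually resolves, one way to test is to map the hostname to the IP locally in the hosts file (the domain name below is a stand-in for the real one):

      # /etc/hosts on Linux/Mac, or C:\Windows\System32\drivers\etc\hosts on Windows
      # Temporary entry for testing only; remove it once real DNS resolves
      178.239.60.18   example.com www.example.com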

    Read the article

  • Firefox hangs with java in Linux

    - by Xolve
    I am using Firefox 3.6 on Linux and I have Java 6 update 15, so obviously I am using the newer plugin type. Whenever I open a page with a Java applet, the browser just becomes unresponsive and hangs. I tried with Chrome; I think it detects the Java plugin from Mozilla's plugins folder. Due to its multiprocess architecture it doesn't hang, and it shows the initial applet loading window, but then it reports that the plugin has crashed. The only way I can run Java applets is to download the class file, make an HTML file, and run the applet using appletviewer.

    Read the article

  • Cygwin file and directory user and group

    - by dvanaria
    I use Cygwin as my main development environment on both my home and work computers. In order to share files between the two computers, I use Dropbox, which is installed in the following folder on both computers: c:\cygwin\home\dvanaria\dropbox. Everything works great, except for one thing. When I'm working on my home computer and do an ls -l on any directory, all the files show up as owned by dvanaria of group Users. But when I work from my work computer, an ls -l shows all files as being owned by Administrators and of group Domain Users. I know Cygwin maps Windows users and permissions through the /etc/passwd file, but to be honest I have no idea how this file works or how it maps to Windows under Cygwin. Could anyone help figure this out? The main problem is that I can't edit any files when using my work computer, only read them.
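
    As a rough pointer (details vary by Cygwin version): in older Cygwin setups, /etc/passwd and /etc/group are plain text files generated from the local accounts and/or the domain with mkpasswd and mkgroup, and ls -l falls back to generic names like Administrators / Domain Users when the owning SID has no entry in them. A sketch of regenerating the maps on the work machine, after backing up the current files:

      # From a Cygwin shell on the work computer
      cp /etc/passwd /etc/passwd.bak
      cp /etc/group  /etc/group.bak

      # Rebuild from local (-l) and domain (-d) accounts; -d can be slow on big domains
      mkpasswd -l -d > /etc/passwd
      mkgroup  -l -d > /etc/group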

    Read the article

  • Windows Vista Backups?

    - by skaz
    I am trying to configure Windows Backup on Vista but don't see some capabilities I would expect to be there. For one, it looks like I can only select a local drive or a network share. I want to use a local drive, but I want to use a subfolder on one of the drives. Must I really pick the root? As a work-around, I made a network share pointing at the local drive, thinking I could then pick the network share. However, when I do this, I am prompted for credentials to hit the share, and none work. The share works in Explorer, though, and it works from other computers, so the access is configured correctly. Is there any way to do what I am trying to do? Thanks.

    Read the article

  • gmail download by POP3 won't download all emails. How to reset all emails to not downloaded so that ALL will download?

    - by Rob
    I want to download emails from Gmail using POP3 with Outlook Express. It downloads about 350 or so emails but doesn't download the remainder - there are over 2000 emails. The emails downloaded are not recent ones. I've tried disabling and re-enabling the POP options in the settings in Gmail itself, but this doesn't fix the issue. Any ideas? Failing that, I would use IMAP and then try to copy the mail locally on my machine to the standard POP Inbox folder in Outlook Express, so that Express Archiver (a separate program) can then archive each email as a file with meaningful file names (e.g. subject, sender). I want to download the email because I back it up with the project work material it relates to, so it is all in one place.

    Read the article

  • Video/films organizing software for mac

    - by tig
    I am looking for an application that will help me organize movies, clips and other videos I have. I tell it a folder to watch, and then I can mark every video there as watched/unwatched, set a rating, add a description, and tag it. It should also be able to open items with my preferred video player (for me, VLC). An option to pull all the info from IMDb or another source would be very good. yFlicks would be a good one, but it uses QuickTime, so it doesn't like any non-standard codec or container (for example MKV), and it doesn't work well on Snow Leopard. Any suggestions?

    Read the article

  • How to allow Mac OS X's native Apache/PHP installation to access WebServer directories?

    - by Martin Bean
    I have a problem bugging me with Mac OS X's native Apache/PHP installation. With my PHP scripts, I have to alter the file permissions on each folder I want my scripts to access. For example, in an upload script I would have to set the destination directory to 'read & write' for the group 'everyone'. However, I believe this is not best practice and would like all of my directories to be readily writable by PHP. My scripts are stored in /Library/WebServer/Documents/, which is Mac OS X's default directory for serving web pages locally.
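
    One commonly used middle ground, as a sketch rather than the only way: OS X's bundled Apache runs as the _www user, so directories that PHP must write to can be group-owned by _www and opened to that group only, instead of to everyone. The uploads path below is just an example:

      # Give the web server group write access to the writable directories only
      sudo chgrp -R _www /Library/WebServer/Documents/uploads
      sudo chmod -R 775  /Library/WebServer/Documents/uploads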

    Read the article

  • How can I stop ntbackup requiring my new password every time I'm forced to change my Windows password?

    - by Lunatik
    I have a scheduled job that runs each night using ntbackup, which copies a folder on my HDD to a network share. The problem is that every time I'm required to change my Windows password I have to remember to change it in ntbackup as well, otherwise the backup fails silently, i.e. I get no warning that the backup isn't being done. Is there a way to schedule this job so it will automatically pick up my new Windows password, or somehow not be tied to my main login? My user account type is Debugger, not full Administrator, so I'm not sure if that would restrict me in any way, e.g. still forcing a four-weekly password change on a dedicated user account for this. The PC runs XP SP2 on a Windows Server 2003 R2 domain.
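
    A small aside that doesn't remove the manual step but makes it quick: the saved credential lives with the Scheduled Tasks entry that launches ntbackup, so on XP it can be refreshed from a command prompt right after each password change (task name and account below are placeholders):

      rem Update the account and stored password for the nightly job
      schtasks /change /tn "Nightly ntbackup" /ru MYDOMAIN\myuser /rp NewPasswordHere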

    Read the article

  • Find hosted directories/ports in Jetty/Apache

    - by Paul Creasey
    Hi, I first asked this on SO, but I didn't get a response and I think it is probably more appropriate here. Let's say I have a directory which is being hosted by Jetty or Apache (I'd like an answer for both), I know the URL including the port, and I can log into the server. How can I find the directory that is being served on a certain port? I'd also like to go the other way: I have a folder on the server which I know is being hosted, but I don't know the port, so I can't find it in a web browser. How can I find a list of directories that are being hosted? This has been bugging me for ages but I've never bothered to ask before! Thanks.
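
    A rough way to attack both directions from a shell (the paths below assume a Red Hat-style Apache layout and a Jetty install under /opt/jetty; adjust to the actual install, and 8080 is just an example port):

      # Port -> process: find out what is listening, then where its config lives
      netstat -tlnp | grep :8080          # or: lsof -i :8080
      ls -l /proc/<pid>/cwd               # working directory of that process

      # Apache: ports and document roots are declared in the config
      grep -Ri "Listen\|DocumentRoot\|Alias" /etc/httpd/conf /etc/httpd/conf.d

      # Jetty: connector ports sit in jetty.xml, content roots in the context/webapp descriptors
      grep -Ri "port" /opt/jetty/etc/jetty.xml
      grep -Ri "resourceBase\|war" /opt/jetty/contexts /opt/jetty/webapps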

    Read the article

  • Upgrade manually-installed msi by assigning software through gpo

    - by Mr Happy
    In the past I rolled out software by manually installing it on a "golden" workstation, creating a (Ghost) image from that and rolling that out to the other workstations. I try not to do that any more for simple/small software; when possible (if it's an MSI) I assign the software through GPO. I'm having a problem with one of those. The software was manually installed on the image, which was rolled out, and now I have an update for that software (a new MSI) that I'd like to assign through GPO. I don't know if it's relevant, but it's user-assigned. The new version gets installed alongside the old version (this is possible since the program folder differs between the versions). When I install the same MSI by hand, it properly removes/upgrades the old version, though. Is what I am trying to do possible?

    Read the article

  • SharePoint 2007 Enabling Incoming Email Error

    - by Cherie Riesberg
    Symptom: When configuring incoming e-mail, the e-mails come through just fine if the server name is in the e-mail address ([email protected]), but when you change it to a vanity name ([email protected]), the message is bounced back and you get this error:

      Delivery has failed to these recipients or distribution lists: [email protected]
      Your message wasn't delivered because of security policies. Microsoft Exchange will not try to redeliver this message for you. Please provide the following diagnostic text to your system administrator.
      The following organization rejected your message: servername01.fqdn.com.

    Problem: The SharePoint server relay rejects the message because it doesn't recognize the name. You have set it up in Exchange, but you need to set up an alias in the SMTP service on the SharePoint server.

    Solution: Configure an alias domain. An alias domain is an alias of the default domain. You can set up alias domains that use the same settings as the default domain. Messages that are received by the SMTP Service for an alias domain are placed in the Drop folder that is designated for the default domain. To configure an alias domain, follow these steps:

      1. Start IIS Manager or open the IIS snap-in.
      2. Expand Server_name, where Server_name is the name of the server, and then expand the SMTP virtual server that you want (for example, Default SMTP Virtual Server).
      3. Right-click Domains, point to New, and then click Domain. The New SMTP Domain Wizard starts.
      4. Click Alias, and then click Next.
      5. Type a name for the alias domain in the Name box, and then click Finish.
      6. Quit IIS Manager or close the IIS snap-in.

    Read the article

  • Add a right-click context menu in certain Windows folders that will open a web browser URL?

    - by jasondavis
    In a Windows Explorer window where you browse files in Windows 7, I would like to add a new context menu item that will allow me to open a file on my local dev server. It would have to open a browser like Google Chrome, and the URL would have to be the file path, slightly modified: part of it removed and my localhost URL prepended. For example, the path of the file I am right-clicking on might be E:\Server\htdocs\labs\php\testProject\test.php. I would need a button in the context menu, Open in Browser, that would open my web browser with a URL like this: http://localhost/labs/php/testProject/test.php. I would love to be able to do this; any ideas or help would be greatly appreciated! To go one step further, it would be nice to somehow make the context menu item only show up for files that are under E:\Server\htdocs, but this is far less important.
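
    One possible approach (a sketch; the script path and key name are placeholders to adapt): add a per-user shell verb under HKCU\Software\Classes\*\shell whose command calls a small PowerShell script, and let the script rewrite the path into the localhost URL and only act when the file lives under the document root.

      Windows Registry Editor Version 5.00

      [HKEY_CURRENT_USER\Software\Classes\*\shell\Open in Browser]

      [HKEY_CURRENT_USER\Software\Classes\*\shell\Open in Browser\command]
      @="powershell.exe -NoProfile -WindowStyle Hidden -File \"C:\\scripts\\open-in-browser.ps1\" \"%1\""

    And the helper script:

      # C:\scripts\open-in-browser.ps1  (sketch)
      param([string]$Path)

      $root = 'E:\Server\htdocs'                 # only files under this folder get a URL
      if ($Path.StartsWith($root, [System.StringComparison]::OrdinalIgnoreCase)) {
          $rel = $Path.Substring($root.Length) -replace '\\', '/'
          Start-Process ("http://localhost" + $rel)   # opens the default browser
      }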

    Read the article

  • Exchange 2010 Autodiscover/OAB update issue

    - by bulldog5046
    Midway through a migration from Exchange 2003 to 2010, with a few test users on 2010, I've noticed that the OAB is not being downloaded to Outlook clients. I've checked that the URLs are configured, added both our CAS servers to the web-based distribution list for the OAB, and assigned the OAB to the 2 mailbox databases we use, but when I use Outlook's 'Test E-Mail AutoConfiguration' I still see that autodiscover says "OAB URL: Public Folder", even though I've now deselected that option. I ran Test-OutlookWebServices, which was giving an OAB error about no URL in the autodiscovery, but having just re-run it, that now appears fine, yet the autoconfiguration test still does not. Does anyone have any idea why I'm getting this discrepancy?
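
    For reference, the distribution settings and a forced rebuild can be checked from the Exchange 2010 Management Shell; a sketch, using the default OAB name and a placeholder CAS server name:

      # How is the OAB set to be distributed, and which virtual directories does it use?
      Get-OfflineAddressBook "Default Offline Address Book" |
          Format-List Name,WebDistributionEnabled,PublicFolderDistributionEnabled,VirtualDirectories

      # Are the OAB virtual directory URLs populated on each CAS?
      Get-OabVirtualDirectory | Format-List Server,InternalUrl,ExternalUrl

      # Regenerate the OAB and push it out to the CAS file distribution service
      Update-OfflineAddressBook "Default Offline Address Book"
      Update-FileDistributionService -Identity CASSERVER01 -Type OAB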

    Read the article

  • Windows server response time very high

    - by Nagaraju Bandla
    Server specs: Windows Server 2008 R2 64-bit, provider Fasthosts, .NET Framework 4.0, 6 GB RAM (it's using 4.6 GB).

    I have a website with thousands of pages, structured like:

      folderone/1/one to 500.aspx
      folderone/2/one to 500.aspx
      ...
      folderone/500/one to 500.aspx

    Loading these pages for the first time after a release takes about 20 to 30 minutes per folder; once one page in a folder has loaded, the rest of the pages load fine. This happens for all folders, and it repeats every time I restart the server, add anything to App_Code, or change web.config. My site works mainly through Google, and due to this problem it's giving errors. Any help will be highly appreciated - I am happy to buy you a beer if it's resolved. Thanks in advance...
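
    The symptoms match ASP.NET batch-compiling each folder's pages on the first request after an app-domain restart. Two things commonly tried, shown as a sketch with placeholder hostname, page name, and paths (the real URLs would follow the folder structure above): warm the folders with a script after each restart, or precompile the site with aspnet_compiler at release time.

      # warm-up.ps1 (sketch, PowerShell 2.0-friendly)
      # Requesting one page per folder should trigger that folder's batch compile.
      $client = New-Object System.Net.WebClient
      foreach ($i in 1..500) {
          $url = "http://www.example.com/folderone/$i/1.aspx"   # placeholder host/page
          try   { [void]$client.DownloadString($url) }
          catch { Write-Warning ("failed: " + $url) }
      }

      # Alternative at release time, from a command prompt on the server (paths are placeholders):
      #   %windir%\Microsoft.NET\Framework64\v4.0.30319\aspnet_compiler.exe -v / -p C:\inetpub\wwwroot\site C:\precompiled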

    Read the article

  • Windows Home Server Passwords Do Not Match [closed]

    - by Ben Fulton
    I have a Windows Home Server that chugs along just fine most of the time. I've never bothered to put it on a UPS, so it's vulnerable to the power outages that happen a few times a year. This most recent time, it came back and seemed to be fine, but whenever I try to access a shared folder I get "Passwords do not match". They matched before the power went out, and I couldn't update the WHS password since I apparently didn't know the old one. How do I fix this?

    Read the article

  • NFS users getting a laggy GUI experience

    - by elzilrac
    I am setting up a system (Ubuntu 12.04) that uses LDAP, PAM, and autofs to load users and their home folders from a remote server. One of the ways to log in is to sit down at the machine and start a GUI session. Programs such as Chromium (the browser) that perform many read/write operations on ~/.cache and ~/.config are slowing down the GUI experience, as well as putting strain on the NFS server, which is causing problems for other users. Ubuntu has the handy XDG_CONFIG_HOME and XDG_CACHE_HOME variables that can be set to move the default location of .cache and .config from the home folder to somewhere else. There are several places to set them, but most of them are not optimal:

      /etc/environment
        pros: works across all shells
        cons: cannot use variables like $USER, so you can't give users different locations for .cache and .config; every user's new location would be the same directory

      /etc/bash.bashrc
        pros: $USER works, so you can place them in different folders
        cons: only gets run for bash-compatible shells

      ~/.pam_environment
        pros: works regardless of shell
        cons: cannot use system variables (like $USER), has its own syntax, and has to be created for every user
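
    One more candidate, as a sketch (the scratch path is a placeholder): a script in /etc/profile.d can use $USER and is picked up by login shells and, on most Ubuntu 12.04 setups, by the display manager's session scripts as well, though that last part is worth verifying for the session type in use.

      # /etc/profile.d/xdg-local.sh  (sketch)
      # Keep chatty per-user caches on local disk instead of the NFS home directory.
      if [ -n "$USER" ]; then
          export XDG_CACHE_HOME="/var/cache/local/$USER"
          mkdir -p "$XDG_CACHE_HOME"
      fi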

    Read the article

  • Use Windows 7 offline sync with external usb hd

    - by René
    Yeah, the whole question is really in the header. Is there a way to use Windows 7 offline sync (which we know from network-mapped drives) with an external USB HD? If not, are there similar built-in tools or good third-party tools? My scenario: I want to buy an ultrabook with an SSD, which is rather limited in space. So I'm going to put all files on an external HD and only store current projects on the local SSD. Let's say I have to change projects: it would be easy to just change the sync folders and have the second project synced to my HD too. With network-mapped drives it's that easy: paths do not differ if the drive is offline, so in most situations you don't even notice that the folder is offline, and you only have to activate offline files for the folders you currently need for work. So is there a similar solution for USB hard drives?

    Read the article

  • Fixing folder permissions [closed]

    - by Cezar Luiz
    Hello everyone, good afternoon. I messed something up here and one of my folders ended up looking like this:

      ls -lha
      total 20K
      ?--------- ? ? ? ? ? brsdinfra001

    where it should show something like:

      drwxrwxr-x 2 nobody nobody 4.0K Oct 4 09:45

    Does anyone know how to fix this?

    Read the article

  • How to modify partitions after install?

    - by ChocoDeveloper
    I wanted to have Ubuntu with full disk encryption on one big partition, and Windows on a small one. In 12.04, only the Server Edition installer offers full disk encryption, so I used that and then installed ubuntu-desktop. When it asked for the size, I reduced it from ~999GB to ~750GB. Now, after the install, both GParted and Disk Utility show /dev/sda2 taking ~931GB, with nothing unallocated, so I can't create a partition for Windows. I got the size right, because when I right-click inside a folder and hit 'Properties', I see Free space: ~690GB (I don't know why it's not ~750GB, but at least it's not 900). The command df -h shows the same. So what can I do? Normally I would just resize a partition with GParted to create unallocated space, then create the partition. But here I have two problems: GParted does not seem to be showing the correct values, and it also says it does not support LUKS, so I'm afraid it will mess things up. Any thoughts?
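
    A hedged first step before touching anything: the 12.04 encrypted install normally puts LVM inside a LUKS container, in which case the ~931GB /dev/sda2 and the ~750GB filesystem can both be correct at once (the size chosen at install time typically applies to the logical volume, not to the partition GParted sees). Confirming the actual stack makes the resize plan much clearer; the mapper name below may differ on your system:

      # Show the partition / LUKS / LVM / filesystem stack
      sudo lsblk
      sudo pvs && sudo lvs
      sudo cryptsetup status /dev/mapper/sda2_crypt   # adjust to the name lsblk reports
      df -h /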

    Read the article

  • Extensions disappear when I close and open Google Chrome

    - by PavanM
    I am running the latest version of Google Chrome, 23.0.1271.97 (Official Build 171054) m, on Windows 7. Any new extension I install simply disappears (not disabled - a total disappearance) once I close and restart Google Chrome. This is not happening to one of my old extensions; it stays there across Chrome restarts. I tried everything Google's help suggested: I created a new user profile by renaming the Defaults folder, and I checked for any permission change that the extensions might have undergone - this is not the case. I am not running in developer mode. This only happens when I close ALL instances of Google Chrome; if even one instance of Chrome is running, it doesn't happen. But I can't have an instance of Google Chrome always running :( I even reported the issue to the Google Chrome team to no avail, and new.crbug.com is offline. And I skimmed through many threads opened for the same issue only to find souls like me. SE is my last resort :)

    Read the article

  • linux: upload / download difference on network shares

    - by Batsu
    I have a Red Hat Enterprise Linux 6 machine (with SELinux) which shows a significant difference in speed between download and upload (the latter significantly slower) of files shared over the LAN. The bottleneck seems to be the output of the Linux machine, since I get a rate of around 1Mb/s when:

      - WinXP machines download files shared (using Samba) by the RHEL machine
      - uploading files from the RHEL machine to a WinXP machine's shared folder

    while:

      - uploading from the XP machines to the Linux shares
      - downloading XP machines' shares on the RHEL machine
      - any share between Windows machines

    all run smoothly (around 50Mb/s). Since the upload from RHEL to a WinXP share is slow too, I would exclude an issue in the Samba configuration. What could possibly determine this limit in the upload speed?

    Update: iptables doesn't show any output rule, and disabling it doesn't make any noticeable difference, so I would rule that out too.
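
    A couple of checks that help separate a network-level problem from a Samba one (a sketch; the interface name and the peer's IP are placeholders, and iperf needs to be installed on both ends):

      # Raw TCP throughput in the slow direction, with Samba out of the picture
      # (run "iperf -s" on a Windows client first)
      iperf -c 192.168.1.50 -t 20

      # Negotiated speed/duplex on the RHEL NIC - a duplex mismatch gives exactly
      # this kind of one-directional slowdown
      ethtool eth0

      # Error/drop counters on the interface
      ip -s link show eth0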

    Read the article

  • Windows 7 'All Programs' folders self-close on right-click

    - by Madmanguruman
    Odd issue on my Windows 7 Professional (32-bit) system. If I click on the Start 'orb', navigate to 'All Programs', then navigate to any of the folders that appear at the bottom of the list (like Accessories) and left-click, the folder contents expand with no issue. If I right-click, the context menu appears for half a second or so, then the popup goes away and the entire Start menu closes. I'm not sure how to debug this issue - I'm considering using Autoruns to disable things hooked into the shell one by one. Is there a way to use a tool like Process Explorer to narrow down the process that's actually dismissing the menus?

    Read the article
