Search Results

Search found 12775 results on 511 pages for 'remote validation'.

  • How to enable extension when running Firefox for the first time?

    - by spektom
    I need to run Firefox each time in a new profile directory with my extension enabled. What I do is the following: create a temporary directory for storing the profile (/tmp/profile.123), create an extensions directory (/tmp/profile.123/extensions), and create an extension proxy file as described here (/tmp/profile.123/extensions/[email protected]). My command line looks like this: firefox -no-remote -profile /tmp/profile.123 -url http://www.google.com The problem is that my extension starts disabled, and I'm forced to enable it manually and restart Firefox. Is it possible to make it start enabled in the first place? Thanks! Workaround I've found: create an extensions.sqlite database file in the newly created profile folder. This file must contain my extension's entry in the "addon" table.
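
    A minimal sketch of this setup as a script (the paths and extension ID are taken from the post; unpacking the XPI directly under its ID inside the extensions directory, instead of using a proxy file, is an assumption, offered as one way to have the add-on registered and enabled on first run):

        profile=/tmp/profile.123
        mkdir -p "$profile/extensions"
        # Assumption: install the add-on by unpacking it under its ID rather
        # than pointing a proxy file at a source directory.
        unzip -o myextension.xpi -d "$profile/extensions/[email protected]"
        firefox -no-remote -profile "$profile" -url http://www.google.com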

    Read the article

  • Performance Monitor (perfmon) showing some unusual statistics

    - by Param
    Recently I thought I would use perfmon.msc to monitor processor utilization on remote computers, but I am faced with a peculiar situation. Please see the screenshot below. I have selected three computers -- QDIT049, QDIT199V6 & QNIVN014. Please observe the Processor Time %, which I have marked with a red circle. How can it be more than 100%? The total Processor Time can never go above 100%, am I right? And if I am right, why is the Processor Time % showing 200%? Please let me know how this is possible, or where I have made a mistake. Thanks & Regards, Param

    Read the article

  • Why does waking a PC up with a timer act differently than with the power button?

    - by Dan Rasmussen
    I have a Windows 7 machine set up as a server. It has no monitor and is only accessed through Remote Desktop. I set up two scheduled tasks, one to put the computer to sleep at night and another to wake it up in the morning. When it's woken from sleep by the timer, it stays awake for only a couple of minutes before going back to sleep. When woken by pushing the power button, however, it stays awake all the way until the sleep timer. Why does my PC behave differently in these two scenarios? I have set the PC not to prompt for a user's password on wake, since I worried that the login screen might follow different power rules. I tried SmartPower Configuration but had the same problems. I can provide more details if questions are asked in the comments, but I'm not sure what's relevant.

    Read the article

  • Cannot open SQL 2005 database in SQL server Management Studio 2008 R2 on Windows 7

    - by Darryl Lawrence
    I have Windows 7 64-bit as my OS, with SQL Server 2008 R2 installed. I can connect to SQL Server 2008 databases, but cannot connect to a SQL Server 2005 database. I can, however, connect to the 2005 database from a PC that has Windows XP as the OS and also has SQL Server 2008 R2 installed. So it seems that it works fine on XP but not on Windows 7 (32- or 64-bit). Is this an operating system issue? Error message: Cannot connect to OMRSQLV016\PRODSQL002. =================================== A network-related or instance-specific error occurred while establishing a connection to SQL Server. The server was not found or was not accessible. Verify that the instance name is correct and that SQL Server is configured to allow remote connections. (provider: SQL Network Interfaces, error: 26 - Error Locating Server/Instance Specified) (.Net SqlClient Data Provider) For help, click: http://go.microsoft.com/fwlink?ProdName=Microsoft+SQL+Server&EvtSrc=MSSQLServer&EvtID=-1&LinkId=20476 Error Number: -1 Severity: 20 State: 0

    Read the article

  • Backup a linux webserver to windows

    - by shaiss
    I have our websites hosted on a third-party web server, with all the admin access needed. I have a local Win2K3 machine that uses Retrospect to back up all the networked machines and servers, and Navicat to back up the MySQL DBs both locally and on the remote Linux web server. So the only part that remains is incremental backups of the files on the web server. Anyone have any suggestions on how to do this? rsync with DeltaCopy? Any others?
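
    A minimal sketch of the rsync-with-DeltaCopy idea (the hostname, module name, and paths are assumptions): DeltaCopy wraps an rsync daemon on the Windows side, so the Linux web server can push only the changed files to it:

        # "webbackup" is a hypothetical module defined in the DeltaCopy
        # server on the Win2K3 machine; --delete mirrors removals too.
        rsync -avz --delete /var/www/ backup@win2k3-box::webbackup/www/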

    Read the article

  • mail server administration

    - by kibs
    My Postfix does not seem to be relaying mail through the SMTP daemon; I am getting the message below:

        The message WAS NOT relayed
        Reporting-MTA: dns; mail.mak.ac.ug
        Received-From-MTA: smtp; mail.mak.ac.ug ([127.0.0.1])
        Arrival-Date: Wed, 19 May 2010 12:45:20 +0300 (EAT)
        Original-Recipient: rfc822;[email protected]
        Final-Recipient: rfc822;[email protected]
        Action: failed
        Status: 5.4.0
        Remote-MTA: dns; 127.0.0.1
        Diagnostic-Code: smtp; 554 5.4.0 Error: too many hops
        Last-Attempt-Date: Wed, 19 May 2010 12:45:20 +0300 (EAT)
        Final-Log-ID: 23434-08/A38QHg8z+0r7
        undeliverable mail - MTA BLOCKED

    Output from the lsof -i tcp:25 command:

        master 3014 root 12u IPv4 9429 TCP *:smtp (LISTEN)

    (Postfix as a user is missing.)
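
    A hedged diagnostic sketch: "too many hops" usually means the message is looping, e.g. a content filter handing mail back to port 25 instead of a dedicated re-injection listener (the content-filter guess is an assumption, suggested by the Final-Log-ID format above):

        # Where does Postfix hand mail off after filtering?
        postconf -n | grep -i content_filter
        # Is there a separate re-injection listener (commonly 127.0.0.1:10025)?
        grep -n 'smtpd\|10025' /etc/postfix/master.cf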

    Read the article

  • FTP Issue when connecting to a debian machine from windows

    - by erin c
    I have a .NET application which copies a bunch of files to a specific FTP folder on a Debian machine on a periodical basis. The FTP folder has mode 755, and the owner of the directory is the FTP username that I authenticate with in the .NET application. So far I have tested this application against a number of Debian boxes; my initial attempts generally fail with the following message if I try it with a Debian machine that I haven't tried before: "remote server returned an error 550 File unavailable". When I see this error, I log onto another Debian machine on my network and try FTP'ing the box that returns the error from the command line. I generally "put" a very small file into the folder in question, and right after that the Windows application starts copying files successfully via FTP. It is as if my command-line FTP operation fixes the problem and makes Debian compatible with my .NET application. I checked permissions before and after the problem, and it doesn't look like what I did changed anything. I am at a loss understanding why this problem occurs and why it is fixed by my silly hack. Can anybody tell me where to look next to fix this extremely annoying issue?
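
    The manual "fix" above, written as a repeatable probe (host, credentials, and path are placeholders); comparing this verbose session against the failing .NET transfer can show whether the 550 relates to the path, the permissions, or the transfer mode:

        # Create a tiny test file, then drive a non-interactive FTP session.
        echo test > probe.txt
        printf 'user ftpuser secret\ncd /srv/ftp/upload\nput probe.txt\nbye\n' | ftp -nv debian-box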

    Read the article

  • Bare bones backup / restore for single Win 2003 server

    - by s.mihai
    I have a single server running Windows 2003 Server and would like to set up a system to be able to perform a bare-bones restore if needed (just plug in a CD or something and get everything back). Ideally the backup could be performed while the server is powered on, so that I don't have to schedule downtime; to restore, I would reboot and use some sort of live CD. Any ideas on this, software and all? (The backup will be done to a remote FTP server with plenty of bandwidth.)

    Read the article

  • Running PHP scripts as the owner of the PHP file: security issues

    - by thomasrutter
    I'm using suexec to ensure that PHP scripts (and other CGI/FastCGI apps) are run as the account holder associated with the relevant virtual host. This allows for securing each user's scripts from reading/writing by other users. However, it occurs to me that this opens up a different security hole. Previously, the web server ran as an unprivileged user, with read-only access to users' files (unless a user changed the file permissions for some reason). Now, the web server can also write to users' files. So while I've prevented different users taking advantage of each other's scripts, I've made it so that in the event that some application has a remote code injection vulnerability, it now has not only read access but also write access to all of that user's scripts and website. How can I deal with this? One idea I've had is to create a second user account for each user account in the system, so that each user has their own user account, and all their scripts are run under another user account. But that seems cumbersome.
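
    A minimal sketch of the second-account idea (all names are hypothetical): the human account owns the code, the execution account can only read it, so an injected payload cannot rewrite the site. Note that stock Apache suexec insists the target user own the scripts it runs, so this separation may need a different executor (e.g. per-user PHP-FPM pools):

        useradd --no-create-home alice-web      # execution account for alice's vhost
        usermod -aG alice alice-web             # alice-web reads via alice's group
        chown -R alice:alice /home/alice/public_html
        chmod -R g+rX,o-rwx /home/alice/public_html
        # Only explicitly designated upload directories would be made
        # group-writable for alice-web.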

    Read the article

  • Worst SysAdmin Accident

    - by Ward
    In line with the question about Best sysadmin accident, what's the worst accident you've been involved in? Unlike the previous question, I mean "worst" in the sense of most system damage or actual harm to people. I'll start with mine: We have two remote wiring closets that are at the end of a 100-foot corridor which has a metal grate for the floor. After we had Cat6 cable installed, the contractors cleaned up all the debris that dropped through the grating to the concrete 3 feet below. A co-worker and I entered the corridor to check on the progress one day but were distracted and didn't notice that a piece of grating had been moved aside. My buddy stepped into air and his chest slammed into the steel crossbar. He was winded and sore enough to take a couple days off, but luckily the steel beam had rounded edges and the size of the opening was such that he didn't smack his head into it or the floor below. Obviously we learned that areas where the floor is partially removed need to be flagged.

    Read the article

  • What is the simplest, open-source webmail frontend available?

    - by josePhoenix
    I am working on a project to create a few extremely stripped down interfaces for common Web/Internet tasks in order to make computers accessible to my visually impaired grandmother. Currently she uses Mac OS X Mail.app, but I had the idea that I could re-skin a webmail interface running on my own server to make it easier for her to use. The ideal webmail interface to use as a starting point would be without frames or AJAX and written in Python, Perl, or PHP5+, though any setup could work as long as the template and stylesheet files were separate from the application itself. This frontend must also connect to a remote IMAP server, since her email account is with her ISP and not on my server. Can anyone recommend a bare-bones, no-nonsense webmail interface that would work for this?

    Read the article

  • zip being too nice (Mac OS X)

    - by stib
    I use zip to do a regular backup of a local directory onto a remote machine. They don't believe in things like rsync here, so it's the best I can do (?). Here's the script I use:

        echo $(date) >> ~/backuplog.txt
        if [[ -e /Volumes/backup/ ]]; then
            cd /Volumes/Non-RAID_Storage/
            for file in projects/*; do
                nice -n 10 zip -vru9 /Volumes/backup/nonRaidStorage.backup.zip "$file" 2>&1 |
                    grep -v "zip info: local extra (21 bytes)" >> ~/backuplog.txt
            done
        else
            echo "backup volume not mounted" >> ~/backuplog.txt
        fi

    This all works fine, except that zip never uses much CPU, so it seems to be taking longer than it should. It never seems to get above 5%. I tried making it nice -20, but that didn't make any difference. Is it just the network or disc speeds bottlenecking the process, or am I doing something wrong?

    Read the article

  • Edit files from Cyberduck in an existing Vim window

    - by Eli Gundry
    I use Cyberduck as my go-to FTP client on Windows. I have but one complaint: whenever I click the edit button to edit a remote file with a local copy of gVim, it opens in a new window/instance of gVim. This leads to a cluttered desktop as well as not allowing AutoComplPop to work at its full potential. What I would like is for every file to open automatically as a new buffer inside the existing gVim window instead of in a new window, kind of like how the Windows version of gVim offers the option to edit a file in a new buffer. Is there any way to do this in the Cyberduck/gVim settings?
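
    A possible approach, sketched under assumptions (gVim must be built with the client-server feature, and Cyberduck must allow a wrapper script or custom editor command): gVim's remote mode opens each file as a tab in the running instance instead of a new window. On Windows the wrapper would be a small .bat file; a shell version:

        #!/bin/sh
        # Open each file as a tab in an already-running gVim; if no server
        # is running yet, --remote-tab-silent starts one.
        exec gvim --remote-tab-silent "$@"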

    Read the article

  • Trying to script rsync using pam_exec

    - by Ricky-Rose
    I'm trying to write a bash script that will execute rsync when called by pam_exec. I've tried a couple of different ways, and I'm not sure what I'm doing wrong. When I try to run the script at login by adding session optional pam_exec.so /usr/bin/local/sync.sh to my sshd file, it gives me an exit code of 12. If I log in and then manually run my script, it allows me to connect to the remote server and lists my files, but it doesn't actually sync anything. I have tried the code below using both $USER and $PAM_USER; $PAM_USER doesn't work at all.

        #!/bin/sh
        rsync -azv -e ssh $USER@remote_server:/home/html/$USER/ /home/html/$USER
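
    A hedged debugging sketch (the log path is hypothetical): pam_exec runs the script as root in a minimal environment that exports PAM_USER but typically not USER, so dumping what the script actually sees at session time often explains both the exit code and the empty variables:

        #!/bin/sh
        # Record who and what pam_exec actually provides at session open.
        { date; id; env; } >> /tmp/pam_exec_debug.log 2>&1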

    Read the article

  • Are there any HTPC-optimized web browsers?

    - by smackfu
    Features that I would ideally include in "HTPC-optimized": Full-screen. Navigable using a remote or the keyboard arrows. Legible at couch distances. Or, to put it another way, imagine the design requirements for Hulu Desktop or XBMC or WMC, applied to a web browser. Opera on a Wii meets most of these criteria but not being HD wastes a lot of potential. If a single solution doesn't exist, is there some combination of Firefox add-ons that will get me there?

    Read the article

  • Site to site VPN using RRAS from an untrusted network?

    - by DrZaiusApeLord
    Our remote office will be moving to a new space where internet will be provided. They'll be behind a router doing NAT (I do not have admin rights to this router). They will be sharing a printer with the other people on the LAN, but will need VPN access to our network for email and file shares. I was thinking of just having them run the Windows VPN client and connect via PPTP like they do when they are off-site, but I have read that multiple PPTP connections from the same NAT'd address to the same destination don't work well, or at all. I am thinking some kind of site-to-site VPN is needed so there is just one tunnel. Can I just put in a VPN gateway, set it to connect to our RRAS/PPTP server, and have them use it as their default gateway? Perhaps even use the local default gateway for internet traffic. If so, what VPN gateway/device is recommended for this? Or other solutions? Thanks.

    Read the article

  • Increase backup speed, Backup Exec 2010 - QNAP TS419U+ NAS

    - by user99912
    We have a QNAP NAS whose network shares are being backed up by Backup Exec 2010 over SMB. We can't install the remote agent on the NAS because it has an ARM processor and, as far as I am aware, there is no compatible agent. Do you have any suggestions for a faster method of backing up these shares, as opposed to the current scenario? Currently the network bandwidth is not the issue; it seems that this access method simply can't go any faster. We've also added the NAS shares to the start of the selection list, but we're still running into an 18-hour total backup time (the total amount of data on the NAS is roughly 650GB). Any comments and/or suggestions welcome. EDIT: Data is being pulled from the NAS by Backup Exec to an LTO4 tape drive.

    Read the article

  • setting up delegate or smtp forwarding

    - by cotiso
    For work we have a remote dedicated server that runs our web service and also runs our email services. At home (Comcast residential internet) I cannot send mail using the dedicated server's SMTP; Comcast spits back an error saying I can only use their SMTP server for sending mail. At work (Comcast business internet) we can use our dedicated server for sending mail with no problem, so I set up a box at work to forward SMTP traffic. I'm new to all this networking stuff, by the way. I used DeleGate to forward the SMTP traffic; can someone point me in the right direction on how to use this program to fix our issue? The DeleGate command I used to test is: delegated -P25 SERVER="smtp://dedicated.server.com:25" PERMIT=":::" -v I also opened up port 25 on the router so it points to my box's IP. Are there any other ways to fool Comcast into thinking I'm using my work's IP to send mail? My coworkers and I have been unable to send mail from home for some time now. Thanks
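
    A hedged sketch of one variation (only flags already shown in the post are used; the port number is an assumption): since the residential side interferes with outbound port 25, the office forwarder could listen on an unblocked port instead, with home mail clients pointed at office_ip:2525:

        # Listen on 2525 at the office and relay to the dedicated server's
        # port 25; PERMIT is kept exactly as in the post.
        delegated -P2525 SERVER="smtp://dedicated.server.com:25" PERMIT=":::" -v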

    Read the article

  • Persistent network share connection not working with runas

    - by binarycoder
    If I use runas /user:DOMAIN\user cmd.exe (on XP), previously mapped persistent network drives are considered unavailable. net use shows:

        Status       Local     Remote                    Network
        -------------------------------------------------------------------------------
        Unavailable  H:        \\SERVER\SHARE            Microsoft Windows Network

    dir H: fails with "The system cannot find the path specified." The connection is easily revived with NET USE H: \\SERVER\SHARE; I am not asked for a password when I do this. What is going on? Can I make Windows safely revive this drive automatically when it is first accessed?

    Read the article

  • squid configuration change to accept http request on LAN

    - by Ratan Kumar
    I installed Squid + DansGuardian to block adult content on my Linux (Ubuntu 12.10) box. Everything worked fine; it blocked as expected. Now the problem is that I am also running an Apache server for my LAN (a kind of website), but when accessing it via 192.168.0.16, Squid blocks the connection. This is the exact error:

        The following error was encountered while trying to retrieve the URL: http://192.168.0.16/
        Connection to 192.168.0.16 failed.
        The system returned: (113) No route to host
        The remote host or network may be down. Please try the request again.
        Your cache administrator is webmaster.

    Before configuring Squid it was working fine. What changes do I have to make in squid.conf? I tried acl Safe_ports 80 and allow_all Safe_ports. (I want to know how I can configure it to allow HTTP requests from the LAN.)
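
    A minimal sketch, assuming Ubuntu's /etc/squid3/squid.conf and a 192.168.0.0/24 LAN (implied by the addresses above). Squid evaluates http_access rules top-down, so the allow must sit above the final "http_access deny all":

        # Config lines to add to squid.conf, above "http_access deny all":
        #   acl lan src 192.168.0.0/24
        #   http_access allow lan
        # Then reload the proxy:
        sudo service squid3 restart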

    Read the article

  • Correct password for ssh key rejected when ssh-d into machine

    - by user20342
    When I am logged into my machine directly, I can do all git operations, and when prompted for a password, the password is accepted. When I ssh into the same box and run git operations on the same repos, the password is rejected. The relevant section of .ssh/config looks like this:

        # Generic settings
        Host *
            ServerAliveInterval 600
            ControlPath /tmp/ssh-%r@%h:%p
            ControlMaster auto
            KeepAlive yes
            IdentityFile ~/.ssh/id_rsa.pub

    The transaction looks like this when I ssh into my box:

        {12-12-03 9:41}hbrown-wks2:~/workspace/spt/project@master hbrown% git pull
        Enter passphrase for key '/home/hbrown/.ssh/id_rsa.pub':
        Enter passphrase for key '/home/hbrown/.ssh/id_rsa.pub':
        Enter passphrase for key '/home/hbrown/.ssh/id_rsa.pub':
        Permission denied (publickey).
        fatal: Could not read from remote repository.
        Please make sure you have the correct access rights and the repository exists.

    Using bash does not appear to make a difference (i.e. ssh-agent /bin/bash). This is a recent development, but I can't cite the change that caused it.
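
    A hedged diagnostic sketch (the hostname is a placeholder): run this from both the console session and the ssh session and compare which identity is offered. Note that the config above points IdentityFile at the .pub file, where ssh normally expects the private key (~/.ssh/id_rsa); an agent in the console session may be masking that:

        # Look for "identity file", "Offering", and agent lines in each case.
        ssh -v [email protected] 2>&1 | grep -iE 'identity|offering|agent'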

    Read the article

  • How can I share a video file during a webinar?

    - by Brien Malone
    Here is the scenario: I have a number of remote employees around the globe. I want to have a video chat session. No problem there. Halfway through, I want to shut off all camera feeds and simulcast (synchronously) a training video to my team. How do I do this? We have tried Office Communicator, but the frame rate was awful and there was no audio. Adobe Connect had similar trouble. In both cases we were limited by the main office's small internet pipe, but it is clear that video delivered over a shared desktop is not a good solution.

    Read the article

  • /dev/fuse "permission denied" even when member of fuse group

    - by steeef
    I have a backup script scheduled on a Debian 5.0 x86 server that mounts a remote directory via sshfs. However, when I attempt to mount the remote directory, I receive:

        failed to open /dev/fuse: Permission denied

    ls -l /dev/fuse returns:

        crwxrwxr-x 1 root fuse 10, 229 2010-11-12 09:08 /dev/fuse

    id backup returns:

        uid=501(backup) gid=501(backup) groups=501(backup),46(plugdev),108(fuse)

    The only way I can get the directory to mount is if I run chmod a+w /dev/fuse, but this is reset at some point during the day. It's a kludge though, and I'd rather figure out why the group permissions aren't working.
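
    A hedged debugging sketch (the log path is hypothetical): log the credentials the job actually runs with, because a process only acquires supplementary groups when its session starts; a daemon or session launched before backup was added to the fuse group will still lack it:

        #!/bin/sh
        # Prepend to the scheduled backup script: if "fuse" is missing here
        # at run time, the job's session predates the group change.
        id >> /tmp/backup-id.log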

    Read the article

  • windows service log on as user a/c on different PC on same workgroup

    - by maruti
    Trying to run a service on PC1 that logs on as admin@PC2 fails when both are in a workgroup. Why could this happen? The OS is Windows Server 2003; please let me know if any Windows remote services have to be turned on, or what firewall configuration is needed. Does having the PCs in the same workgroup help? Let me clarify the question: I am unable to see other computers from the service's Log On tab when selecting a user. The only object types available are "Users" and "Built-in security principals", and the only location is the local computer. But this is available from the MMC console (Add Snap-in). How can this be made available in the Services control panel?

    Read the article
