Search Results

Search found 13411 results on 537 pages for 'proxy servers'.

Page 350/537

  • Best practice? Using DPM to back up VMs within each VM or through the host?

    - by andrew
    We've got two Hyper-V hosts running multiple VMs (all flavors of Windows Server). One of the VMs is running MS Data Protection Manager 2010, which runs beautifully (most of the time) and is connected to a separate NAS via iSCSI for the DPM storage. I noticed that when I installed the DPM agent on the Hyper-V hosts, it enumerates the VMs in the DPM protection listing. I don't want to burn through my storage space too fast with duplicate protection, so I was wondering: is it recommended to back up VMs through the host, or is it better to install the DPM agent on each VM and back it up as I would any other machine? It would seem as though most people (currently including me) do it the second way, but is there any advantage to including the entries under Hyper-V (Backup Using Child Partition Snapshot)?

    Read the article

  • ESXi 5.1 on Poweredge 510 freezes after base-esx update

    - by goober
    Background / Problem: Just experienced an issue where an ESXi host was upgraded from 5.0 to 5.1 perfectly fine. Then I did a scan and remediated a patch (ESXi510-201210401-BG). Watching the host via the KVM switch, this appears to complete successfully. However, on reboot, the server hangs at the "Initializing Power Management" phase. I've read in various spots around the internet that this usually clears itself up upon a cold boot, but given that our servers are in a different building with different access rules, the less I have to physically go there, the better. :) Question: Is there anything I can do to avoid an ESXi host hanging at the "Initializing Power Management" phase of boot after remediating the host to apply patches?

    Read the article

  • Web Application Vulnerability Scanner suggestions?

    - by Chris_K
    I'm looking for a new tool for the ol' admin toolkit and would value some suggestions. I would like to do some "automated" testing of a handful of websites for XSS (cross-site scripting) vulnerabilities, along with checking for SQL injection opportunities. I realize that an automated tool approach isn't necessarily the only or best solution, but I'm hoping it would give me a nice start. The sites I need to scan cover a range of stacks from PHP / MySQL to ColdFusion, with some classic ASP and ASP.NET mixed in for good measure. What tools would you use to scan for web application vulnerabilities? (Please note I'm focusing on the web apps directly, not the servers themselves.)
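
    Two free tools that cover exactly this ground are nikto (general web server and CGI checks) and sqlmap (SQL injection). The invocations below are only a starting-point sketch against a hypothetical target, not a full assessment:

        # broad scan of one site for known issues, outdated components, common XSS entry points
        nikto -h https://www.example.com

        # probe a single parameterised URL for SQL injection; sqlmap speaks MySQL,
        # SQL Server and others, so it covers both the PHP and the ASP/ASP.NET sites
        sqlmap -u "http://www.example.com/page.php?id=1" --batch --level=2

    Commercial scanners such as Burp Suite or Acunetix automate the XSS side more thoroughly if the budget allows.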

    Read the article

  • Apache is up but does not read requests

    - by bosh
    This usually happens a few minutes after restarting Apache: the httpd daemons are up, but are not reading the requests from their sockets. The web clients just wait forever on the connection. When I run netstat, the Recv-Qs show a positive byte count which does not change. So basically the connection between the client and Apache is in the CONNECTED state but no progress is made. Restarting Apache solves the problem for a couple of minutes, but then it's deja vu all over again. Other servers (sshd, ftpd, etc.) are fine. Where should I look further? Any clue? Thanks!
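
    When it happens again, it is worth finding out what the httpd workers are actually doing before restarting - a positive, frozen Recv-Q usually means no worker is free to read from those sockets. A minimal diagnostic sketch (the first command assumes mod_status is enabled; process names differ between distros):

        # scoreboard: are all workers busy, with none idle?
        apachectl fullstatus

        # how many workers exist compared to your MaxClients setting?
        ps -C httpd -o pid,stat,start,wchan    # use -C apache2 on Debian/Ubuntu

        # attach to one apparently stuck worker and see which syscall it is blocked in
        strace -tt -p <pid-of-one-worker>

    If the workers all turn out to be blocked on the same thing (a database socket, NFS, a DNS lookup), that dependency is the real culprit rather than Apache itself.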

    Read the article

  • How do we keep Active Directory resilient across multiple sites?

    - by Alistair Bell
    I handle much of the IT for a company of around 100 people, spread across about five sites worldwide. We're using Active Directory for authentication, mostly served to Linux (CentOS 5) systems via LDAP. We've been suffering through a spate of events where the IP tunnel between the two major sites goes down and the secondary domain controller at one site can't contact the primary domain controller at the other. It seems that the secondary domain controller starts denying user authentication within minutes of losing connectivity to the primary. How do we make the secondary domain controller more resilient to downtime? Is there a way for it to cache the entire directory and/or at least keep enough information locally to survive a multi-hour disconnection? (We're all in a single organizational unit if that makes any difference.) (The servers here are Windows Server 2003; don't assume that we set this up correctly. I'm a software engineer, not an IT specialist.)
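
    Before changing the topology, it is worth confirming what the branch DC thinks is wrong while the tunnel is up; a sketch using the stock diagnostics, run on the secondary DC (the domain name is an example):

        rem which DC does the locator hand out, and from which site?
        nltest /dsgetdc:EXAMPLE.LOCAL
        rem is AD replication between the two sites healthy?
        repadmin /showrepl
        rem broader health check of the local DC
        dcdiag /test:replications /v

    If the secondary DC is healthy, the usual culprits for "fails within minutes of losing the tunnel" are DNS (the branch DC or its clients still pointing only at the remote site's DNS server) and the branch DC not being a Global Catalog, rather than AD itself, which is designed to keep authenticating from a local DC during a WAN outage. On the CentOS side, listing both DCs in ldap.conf (and running nscd for lookup caching) helps the clients ride out the loss of one of them.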

    Read the article

  • Anyone know where I can download a copy of Sun Java System Active Server Pages 4.0.3 for Solaris?

    - by ewengcameron
    I've contacted Sun regarding this, and they have told me that the download is no longer available as Active Server Pages 4.0.3 is now end-of-life. We need to upgrade our server to 4.0.3 to achieve PCI-DSS compliance. Anyone know of a site where I can download older copies of Sun files? Sun offers 4.0.1 and 4.0.2 for download, but not 4.0.3, which is going to cause problems come October when Visa stops accepting transactions from non-PCI-compliant servers. If Sun kept their naming system consistent across versions, the file would be called "sjsasp403-sol-sparc.tar". I know the real solution is to upgrade every site on the server to use a different server language, e.g. PHP, and in the long term this is our goal, but we have over 100 sites requiring upgrading and it's not viable to get this done before October.

    Read the article

  • xauth error with ssh X Forwarding

    - by bdk
    From my (Debian) desktop machine, I am trying to ssh into a Debian server with ssh -X remote-ip. After logging into the remote host, I get: /usr/bin/X11/xauth: creating new authority file /root/.Xauthority /usr/bin/X11/xauth: (stdin):1: bad display name "unix:10.0" in "remove" command /usr/bin/X11/xauth: (stdin):2: bad display name "unix:10.0" in "add" command And the X forwarding doesn't work. From my desktop I can ssh -X into other Debian servers and it works fine. I found a lot of threads discussing similar issues on Google, but they all seem to fade out without a solution, and the simple things suggested there, like exporting DISPLAY or setting xhost +, don't seem to make a difference.
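
    A few things that commonly clear this particular pair of xauth errors; a sketch, assuming you can edit sshd_config on the problem server:

        # on the server, as the account you log in with (root here):
        rm -f ~/.Xauthority                # discard a possibly stale or corrupt authority file
        dpkg -l xauth                      # confirm xauth is actually installed on the server

        # check the X11 settings the server's sshd is using
        grep -Ei 'x11forwarding|x11uselocalhost|xauthlocation' /etc/ssh/sshd_config
        # typically you want:
        #   X11Forwarding yes
        #   X11UseLocalhost yes
        /etc/init.d/ssh restart            # after any sshd_config change

        # then reconnect from the desktop with verbose output and watch the xauth step
        ssh -X -v remote-ip

    The "bad display name unix:10.0" wording usually points at sshd and xauth disagreeing about how the forwarded display should be named, which the settings above (or updating openssh-server and xauth on that host) normally resolve.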

    Read the article

  • How to configure mysql-server for heavy load

    - by Rasmus
    I'm in the process of setting up a new database server. I have run a few MySQL database servers before and they have worked okay, but I would like to hear the recommended setup for my server. For example, what should I set max_connections, query_cache_size, table_cache and so on to? I have around 400-600 queries per second: Open tables: 112, Queries per second avg: 430.386. The server I am setting it up on has the following configuration: Linux version 2.6.32-5-amd64 (Debian 2.6.32-41squeeze2), 2x Intel Xeon X3440 @ 2.53GHz, 4GB RAM, /, /boot, /tmp etc. on software RAID1 (2x 7200RPM SATA), data location on software RAID0 (2x 7200RPM SATA). I am going to place the MySQL databases on the RAID0. Am I missing anything? Let me know! Thanks in advance, I'm looking forward to hearing from you :-) /Rasmus
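
    As a starting point only - the numbers below are assumptions to be checked against your own monitoring (mysqltuner, SHOW GLOBAL STATUS), not recommendations - a my.cnf sketch sized for 4 GB of RAM and roughly 400-600 queries/second might look like:

        [mysqld]
        max_connections         = 200     # comfortably above your observed concurrency
        table_open_cache        = 1024    # called table_cache before MySQL 5.1.3
        thread_cache_size       = 32
        query_cache_type        = 1
        query_cache_size        = 64M     # larger values often hurt more than help on write-heavy loads
        tmp_table_size          = 64M
        max_heap_table_size     = 64M
        key_buffer_size         = 256M    # only matters for MyISAM indexes
        innodb_buffer_pool_size = 2G      # roughly half of RAM if the data is mostly InnoDB
        innodb_log_file_size    = 256M    # changing this requires removing the old ib_logfile* files first

    One other note on the hardware: software RAID0 across two 7200 rpm disks gives no redundancy for the data directory, so make sure the backup story is solid before this goes live.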

    Read the article

  • Email alerts when hard drive fails on a Dell PowerEdge 2950 (PERC5I, SAS)?

    - by BigJoe714
    I recently purchased a used Dell PowerEdge 2950. I set up the hard drives in a RAID-5 configuration. I want to be able to get an email alert if one of the drives fails, and I have been trying to determine the easiest way to set that up. The controller card is listed as PERC5I, SAS PowerEdge. From my numerous Google searches, it looks like I need to install Dell OpenManage Essentials. However, this looks to be a giant application with tons of bells & whistles for managing many servers, when all I really want is something for this one server. Can anyone offer me any insight into what I could do?
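
    You don't need the full Essentials suite for a single box - the per-server package, Dell OpenManage Server Administrator (OMSA), installs the omreport/omconfig tools and its own alerting. As a low-tech alternative, a scheduled task can poll the controller and mail you on a bad state; a sketch (paths and the mail step are placeholders):

        @echo off
        rem check-perc.cmd - run every 15 minutes from Task Scheduler; needs OMSA installed
        omreport storage pdisk controller=0 > C:\scripts\pdisk.txt
        findstr /i /c:"Critical" /c:"Failed" /c:"Degraded" C:\scripts\pdisk.txt >nul
        if %errorlevel%==0 (
            rem a physical disk reported a bad state - send the report with whatever
            rem command-line mailer you have available (blat, bmail, a PowerShell script, ...)
            echo ALERT: PERC reports a failed or degraded disk
        )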

    Read the article

  • Internal+external interfaces with multiple default gateways on win2003

    - by fileitup
    I'm trying to set up several web servers for a load-balanced cluster, and each server needs to be connected to the internal network (for load balancing) as well as to an external network (the internet, for administration). I have two NICs, but since I can't set two default gateways, I have the external gateway as the default and the internal one as a route rule. This setup only works halfway: the internal network is fine, but I can't log in from outside or reach the web from the box. If I switch the gateways, remote login and web access work, but the internal network won't. I'm sure someone has encountered this before, but I wasn't able to find anything online. Any help will be appreciated.
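
    The standard pattern for this on Windows is: keep the default gateway only on the external (internet-facing) NIC, leave the gateway field empty on the internal NIC, and add persistent static routes for the internal subnets via the internal router. A sketch with made-up addresses (10.0.0.0/8 as the internal range, 10.1.1.1 as the internal router):

        rem run once, in a command prompt as an administrator; -p makes the route survive reboots
        route -p add 10.0.0.0 mask 255.0.0.0 10.1.1.1
        route print

    That way internet-bound traffic (remote admin, web) follows the default gateway on the external NIC, while anything destined for the internal ranges is steered through the internal NIC.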

    Read the article

  • Toolkit & Habits for Linux Network & System Administration [closed]

    - by slashmais
    I am tasked with the administration of a small office network as well as several workstations running mostly Debian and Ubuntu. There are two servers: one database & print server, and one backup & file server. Being relatively new to this side of things, and knowing enough to help myself to some degree on Linux, I would like to know what software tools and tasks/habits I can use or acquire to learn this field and be effective while doing so. I don't need to know what is best, just what a newbie sysadmin can use as a starter pack to learn from and use as a base to grow into proper system administration. [edit] What I need is those few basic tools to start with, and the kind of things I need to do regularly, e.g. which logs to check, when and what to monitor - the 'right' place to start, which I can add to as I need.
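
    A bare-bones daily routine on Debian/Ubuntu boxes might look like the sketch below (every command is in the base install or one apt-get away); tools such as Nagios, Munin or logwatch can automate the same checks later:

        df -h                                  # disk space on every filesystem
        free -m                                # memory and swap usage
        uptime                                 # load average at a glance
        tail -n 100 /var/log/syslog            # anything odd since yesterday? also check /var/log/auth.log
        apt-get update && apt-get upgrade -s   # -s = simulate: see pending security updates without applying
        iostat -x 5 3                          # disk I/O pressure (package: sysstat)

    The habit that matters most is making the checks boring and regular - same commands, same time of day - plus keeping notes of every change you make on the servers.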

    Read the article

  • Why does PowerShell run executables in a separate window?

    - by Artem Tikhomirov
    On one of my servers (2008 R2), PowerShell refuses to run executables without an extension, so typing cmd (or & cmd) at the prompt results in the following error message: The term 'cmd' is not recognized as the name of a cmdlet. Invoking the executable in the other usual ways pops up a separate window (which executes asynchronously with respect to the parent). I've tried this in the x86 version of PowerShell and in the x64 one. I've tried the -NoProfile argument. PATH seems to be OK; it includes System32 and all. The only way I've managed to execute cmd inline from PowerShell is opening a standard cmd.exe shell, executing powershell.exe from it, and executing cmd /c echo test from that. Inception, huh? What should I try next?
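
    A few checks worth running in that same PowerShell session before digging deeper - if PATH or PATHEXT is broken for the session, extension-less names stop resolving exactly like this (a sketch; only the last line actually launches anything):

        $env:Path -split ';'      # is C:\Windows\System32 really present for *this* session?
        $env:PATHEXT              # must contain .EXE for "cmd" to resolve without typing cmd.exe
        Get-Command cmd.exe       # can PowerShell locate the binary when the extension is spelled out?
        Get-Command cmd           # and without it?

        # to force a console application to run in the current window rather than a new one:
        Start-Process cmd.exe -ArgumentList '/c echo test' -NoNewWindow -Wait

    If $env:Path looks fine interactively but the problem persists, compare it with [Environment]::GetEnvironmentVariable('Path', 'Machine') - a stray quote or an unexpanded %SystemRoot% in the machine-level value shows up there.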

    Read the article

  • Web App Server hardware question. Which configuration?

    - by JBeckton
    I am pricing some new servers and I am not sure which configuration to get. The server will be running several web applications for our company; some of them are ASP.NET sites and some are ColdFusion. The OS will be Windows Server 2008 Web or Standard Edition. Do I need two processors, or will a single quad-core handle it? Multi-core Xeon with Hyper-Threading or without? I am going 64-bit so I can go higher than 4 GB of RAM. I am shopping at Dell and there are so many options; I want to get the most bang for my buck without going over budget, and I also don't want the machine to be mostly underutilized.

    Read the article

  • How to register a PuTTY public key on Windows Server 2003?

    - by igarren
    We are trying to transfer files from a Unix server to a Windows Server 2003 machine. I currently don't have any visibility into the Unix server since it is being handled by another team. In order for the two servers to establish a connection, they sent us a public key which they said we need to register on our Windows Server 2003 machine. Can anyone help me? I can't seem to find anything about registering a public key generated by PuTTY on Windows Server 2003; there's no obvious place to put the public key (like authorized_keys on Linux). Any help will be appreciated. Thanks in advance. EDIT: we're trying to transfer the files via pscp, if this info is needed.
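
    Two things are implicit in their request: Windows Server 2003 has no built-in SSH server, so one has to be installed (an OpenSSH-based package such as copSSH or OpenSSH for Windows, or a commercial server such as Bitvise WinSSHD), and each of those products has its own location for authorized_keys. If the key they sent is in PuTTY/SSH2 (RFC 4716) format, it also needs converting to the single-line OpenSSH format first; a sketch, run on any box with the OpenSSH tools (file names and paths are examples):

        # convert the SSH2/PuTTY-style public key to OpenSSH format
        ssh-keygen -i -f their_key.pub > their_key_openssh.pub

        # then append it to the authorized_keys of the account pscp will log in as,
        # wherever your chosen SSH server product keeps it, e.g.:
        cat their_key_openssh.pub >> /home/transferuser/.ssh/authorized_keys

    With an OpenSSH-based server the rest works much as it would on Linux: the other team's pscp -i privatekey ... transfer lands in that account's home directory.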

    Read the article

  • ASP.NET error messages are not displayed on the server

    - by asn187
    I have been tasked with setting up ASP.NET websites on a Windows Server 2008 machine; they are all in debug mode. When I browse a website on the server and an error occurs - for example, the database connection cannot be opened - I would expect, as usual, to receive the ASP.NET server error page with an error dump, something like http://www.codeproject.com/KB/books/1861005040/image091.gif. However, what actually happens is that I get random characters on the web page. For example: <?)=????*??2o????v??YK?WuZ,?6[N??f?O??b??@!???u]S??yQ?iN?&e???E???j??1z??x??????o?y????U??M???2d?i?4 This is not the correct or expected behaviour. The event log does, however, show what has gone wrong. How do I get the server error page to render properly? Am I missing something in the server's ASP.NET setup?
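
    The garbled output looks like a compressed (or otherwise double-encoded) response rather than the error page itself; whatever the cause of that turns out to be, these are the settings IIS 7 on Server 2008 needs before it will hand a remote browser the classic detailed error page - a hedged web.config sketch, not a drop-in:

        <configuration>
          <system.web>
            <compilation debug="true" />
            <customErrors mode="Off" />
          </system.web>
          <system.webServer>
            <httpErrors errorMode="Detailed" />
            <!-- temporarily rule the compression theory in or out: -->
            <urlCompression doStaticCompression="true" doDynamicCompression="false" />
          </system.webServer>
        </configuration>

    httpErrors errorMode="Detailed" is the IIS 7-specific piece people usually miss when coming from IIS 6; without it the server substitutes its own terse error page even when customErrors is off.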

    Read the article

  • XenServer VMs lose network settings

    - by Ash
    We deploy virtual machines using Citrix XenServer 6.0 for our clients. Two separate clients experience the same issue: when a Server 2008 virtual machine is restarted, the static IP settings (address, subnet, gateway, primary DNS) don't appear to apply correctly - the IPs cannot be pinged, network services cannot be accessed, etc. The issue is resolved by manually switching the network adapters to DHCP, then re-setting them to the original static IPs. While not a major issue, it's a pain when restarting servers for Windows Updates, plus iSCSI drives need to be manually reconnected to Windows via the iSCSI Initiator. We have tried removing the network adapters from the virtual machine under XenCenter, but without luck. Has anyone experienced similar issues?
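
    Two things worth checking alongside XenServer itself: whether the current XenServer 6.0 Tools (the PV network drivers) are properly installed in the guests, since a re-detected virtual NIC is a classic reason Windows "forgets" a static configuration, and whether Windows has quietly bound the old settings to a hidden, no-longer-present adapter (set devmgr_show_nonpresent_devices=1 in a command prompt, then show hidden devices in Device Manager started from that prompt). As a stopgap, the re-apply step can at least be scripted instead of clicked through; a sketch with example addresses and the default adapter name:

        netsh interface ip set address name="Local Area Connection" static 192.168.1.10 255.255.255.0 192.168.1.1 1
        netsh interface ip set dns name="Local Area Connection" static 192.168.1.5
        netsh interface ip show config

    The same script can finish with iscsicli commands so the iSCSI drives come back without opening the Initiator GUI each time.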

    Read the article

  • Database modularity with EBS volumes

    - by Eclyps19
    I would like to add modularity to my websites on EC2 instances by encapsulating the site files and the MySQL files in their own EBS volumes. The end result I'm going for is the ability to quickly mount a volume or two on different servers running the same AMI (for testing/development/emergency maintenance, etc.), as well as maintain separate snapshots of each. I'm able to do this fairly easily with a single database by symlinking my mounted database EBS volume to the appropriate places (/var/lib/mysql, /etc/my.cnf, /var/log/mysqld.log), but I'm not sure if it would even be possible to have multiple databases on different EBS volumes running concurrently. Example: /website1/www.website.com /database1/ /website2/www.otherwebsite.com /database2/ Could anybody shed some light on this for me? Is it possible? Is it a bad idea? Thanks.
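
    Running several mysqld instances side by side is a solved problem - mysqld_multi exists for exactly this - so the volume-per-database layout works as long as each instance gets its own datadir, socket and port on its own volume. A sketch of /etc/my.cnf using the question's mount points (paths and ports are examples):

        [mysqld_multi]
        mysqld     = /usr/bin/mysqld_safe
        mysqladmin = /usr/bin/mysqladmin

        [mysqld1]
        port      = 3306
        socket    = /database1/mysql/mysql.sock
        datadir   = /database1/mysql
        log-error = /database1/mysqld.log

        [mysqld2]
        port      = 3307
        socket    = /database2/mysql/mysql.sock
        datadir   = /database2/mysql
        log-error = /database2/mysqld.log

    Start and stop them with mysqld_multi start 1,2 and mysqld_multi stop 1,2, and point each website at its own database's socket (or 127.0.0.1:port). Snapshotting stays clean because everything a given database needs lives on its own volume - just remember to FLUSH TABLES WITH READ LOCK (or stop that instance) while the EBS snapshot is taken.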

    Read the article

  • How to automate kinit process to obtain TGT for Kerberos?

    - by tore-
    I'm currently writing a Puppet module to automate the process of joining RHEL servers to an AD domain, with support for Kerberos. Currently I have problems with automatically obtaining and caching the Kerberos ticket-granting ticket via 'kinit'. If this were done manually, I would run: kinit [email protected] This prompts for the AD user's password, hence the problem with automating it. How can I automate this? I've found some posts mentioning using kadmin to create a database with the AD user's password in it, but I've had no luck. Thanks for any input.
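
    The usual non-interactive pattern is a keytab rather than a stored password: put the account's key into a keytab once, then let kinit read it with -k -t whenever a fresh TGT is needed. A sketch (the principal, keytab path and enctype are placeholders; the -k value must match the account's current key version number in AD, and the enctype must be one the domain actually issues, e.g. rc4-hmac on 2003-era domains):

        # one-off, interactive (the indented lines are typed at ktutil's prompt):
        ktutil
            addent -password -p [email protected] -k 1 -e rc4-hmac
            wkt /etc/krb5.join.keytab
            quit
        chmod 600 /etc/krb5.join.keytab

        # from then on, fully scriptable (cron it, or exec it from the puppet module):
        kinit -k -t /etc/krb5.join.keytab [email protected]
        klist     # confirm the TGT is cached

    The other common route on RHEL is to let the join tooling manage host credentials for you ('net ads join' with Samba, or msktutil), which populates /etc/krb5.keytab so the machine never needs a user's password at all.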

    Read the article

  • Microsoft Outlook vs. Lotus Notes in an enterprise environment

    - by Ladislav Mrnka
    I'm working in a company (several thousand client computers) using tools like MS Outlook + Exchange + SharePoint + Communications Server. Now our parent company has shared the idea that everything should be moved to IBM Lotus Notes + Domino. The official reason is that the Outlook-based solution is too expensive (even though they don't want to replace MS Office yet). Another reason is probably that they received a lot of licences for Lotus when they bought IBM servers for the new data center. My question is: is this a reasonable change, or is it just a management game and IBM marketing? Will it really save money? Other question: is Lotus better than the MS solution? This is a serious question; it is not meant to open any kind of flame war. I just don't believe in the decision, and I have never used the Lotus tools.

    Read the article

  • Help with Outlook Exchange server and curl

    - by stib
    I work on a Mac in a building full of PCs, and the IT department here doesn't have IMAP access turned on on the Exchange servers, so I miss a lot of meetings because I don't get reminders, since I access my mail via Outlook Web Access. I had written a script to scrape my Outlook Web Access calendar and turn it into iCal format, so I could get my reminders via Thunderbird or iCal.app. It basically downloaded the calendar page via curl, parsed the HTML, and reformatted all the appointments as iCal. It wasn't elegant, but it worked. Then they changed to Outlook 2007, and it doesn't work anymore. I have a sketchy knowledge of curl, and almost zero knowledge of how Outlook works. Can anyone point me towards a reference for getting calendar info out of an Exchange server without using Outlook? If I can configure curl to get the HTML I will be happy, but if there's a more elegant way, such as getting the calendar info as XML, I'll be delirious.
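
    Two angles that survive the move to Exchange/Outlook 2007; both sketches use placeholder host and account names. First, Exchange 2007 introduced Exchange Web Services (EWS), a SOAP/XML interface that includes calendar queries and is much less brittle than scraping OWA's HTML:

        # does EWS answer? (it lives alongside OWA on the client-access server)
        curl --ntlm -u 'DOMAIN\myuser' https://mail.example.com/EWS/Services.wsdl

    Second, OWA 2007 defaults to forms-based authentication, which is the usual reason an old curl scrape stops working: you now have to POST the login form first and carry the cookie along. The owaauth.dll field names below are the commonly reported ones - treat them as an assumption and check them against your own login page's HTML:

        curl -k -c cookies.txt \
             -d 'destination=https://mail.example.com/owa/&flags=0&username=DOMAIN\myuser&password=SECRET' \
             https://mail.example.com/owa/auth/owaauth.dll
        curl -k -b cookies.txt https://mail.example.com/owa/ -o calendar.html

    If the IT department will allow it, EWS (or WebDAV, which Exchange 2007 still supports) is the route to calendar data as XML rather than HTML.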

    Read the article

  • Gitweb showing opposite colors for added and removed text

    - by Maddy
    Hi, I have installed gitweb on our servers, and it started showing the branches and the commit diffs. But the diff highlighting is reversed for added and removed text: added text is supposed to be green and removed text red, but I am seeing the opposite. I can hack gitweb.css to get the job done, but I would like to know why this is happening and what the proper fix might be. (If anyone knows good themes for gitweb, please mention them.)

    Read the article

  • Increase data transfer speed through bonding/LACP?

    - by Matteo
    I want to maximize the throughput of a data transfer between two servers. The copy will be made at the application layer using Robocopy. To clear things up, please check my Visio schema of the network: FS1---------(SW1)===========(SW2)--------- FS2 SW1 to SW2 is connected through 10 gigabit Fiber Channel Ethernet; FS1 to SW1 is connected through 1 gigabit Ethernet; FS2 to SW2 is connected through 1 gigabit Ethernet. The first idea I've come up with is to use LACP, so I could use two gigabit Ethernet links between each server and its switch. A colleague told me that LACP is for availability and not performance, so he reckons this solution will not work. Is he right? Do I have other options? Thank you very much.
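
    He is partly right: LACP/bonding hashes traffic per flow (per MAC/IP/port tuple), so a single Robocopy session over one SMB connection still tops out at one physical link's speed; what aggregation buys is headroom for multiple simultaneous flows plus failover. Practical levers, roughly in order of effort: give each file server more than one 1 GbE uplink before worrying about the 10 Gb inter-switch link, run several Robocopy jobs against different directory trees in parallel, and on Server 2008 R2 or newer use Robocopy's own multithreading, e.g. (paths are placeholders):

        robocopy \\FS1\data D:\data /MIR /MT:16 /R:1 /W:1 /NP /LOG:C:\logs\fs1-copy.log

    /MT overlaps per-file latency (a big win with many small files) even when the wire itself is the limit; for genuinely more than 1 Gb/s between the two hosts you need either multiple parallel flows that hash onto different bonded links or a faster NIC in each server.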

    Read the article

  • Puppet: how to use data from a MySQL table in Puppet 3.0 templates?

    - by Luke404
    I have some data whose source of truth is in a MySQL database; its size is expected to max out in the some-thousands-of-rows range (in a worst-case scenario), and I'd like to use Puppet to configure files on some servers with that data (mostly iterating through those rows in a template). I'm currently using Puppet 3.0.x, and I cannot change the fact that MySQL will be the authoritative source for that data. Please note, the data comes from external sources, not from Puppet or from the managed nodes. What possible approaches are there? Which one would you recommend? Would External Node Classifiers be useful here? My "last resort" would be regularly dumping the table to a YAML file and reading that through Hiera into a Puppet template, or directly dumping the table into one or more pre-formatted text files ready to be copied to the nodes. There is an unanswered question on SF about system users, but the fundamental issue is probably similar to mine - he's trying to get data out of MySQL.
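
    For a table in the low thousands of rows, the "last resort" is arguably the cleanest Puppet 3.0 option: a cron job on the master renders the table into Hiera's data directory, and manifests and templates consume it like any other Hiera value, with no custom ENC or Ruby functions required. A sketch with hypothetical database, table and key names:

        #!/bin/sh
        # /etc/cron.hourly/mysql-to-hiera (on the puppet master)
        mysql -N -B -e 'SELECT name, ip FROM inventory.hosts' \
          | awk 'BEGIN { print "db_hosts:" } { printf "  - { name: %s, ip: %s }\n", $1, $2 }' \
          > /etc/puppet/hieradata/from_mysql.yaml.tmp \
          && mv /etc/puppet/hieradata/from_mysql.yaml.tmp /etc/puppet/hieradata/from_mysql.yaml

    A manifest then does $db_hosts = hiera('db_hosts') and hands it to template(), where the ERB iterates over @db_hosts. The write-to-temp-then-mv step keeps agents from ever reading a half-written file; the main trade-off versus an ENC or a custom Hiera backend is that the data is only as fresh as the cron interval.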

    Read the article

  • Remote Desktop app can't connect through VPN or through RDP load balancer

    - by nhinkle
    Using the regular Remote Desktop Client (in the desktop environment) I can connect just fine to remote servers when connected through Cisco VPN or when accessing a server behind a load balancer. When using the Remote Desktop app in the Modern UI, I can't do either of these things. Trying to connect to a remote server that's on a private network fails with: Can't find server, make sure the name and domain are correct and try again And connecting to a server that's behind an RDP load balancer fails with the following error, after accepting credentials: Because of a protocol error, this session will be disconnected. Please try connecting to the remote PC again Is there some way to use the Remote Desktop app in these situations, or am I just out of luck?

    Read the article

  • RPC Server Unavailable When Trying to Join W2003 Server to W2003 Active Directory Domain

    - by Roel Vlemmings
    I have an Active Directory domain with a Windows 2003 Standard SP2 server as the DC. When trying to join an additional Windows 2003 Standard SP2 server to the domain, I get the message "The following error occurred attempting to join the domain 'My Domain'. The RPC server is unavailable." The computer is actually added to Active Directory Computers; I can even right-click it and choose Manage. I can access file shares from the DC on the other server and vice versa. I can ping the DC from this server and ping the server from the DC using the computer name. The time on both servers is the same, more or less to the second. The RPC service is running on both servers. I can join other computers to the domain and there are no other issues with the domain. Windows Firewall is disabled on both computers. NetSetup.LOG shows: NetpSetNetloginDomainCache: DSEnumerateDomainTrustsW failed 0x6ba I looked up this Win32 error code: it is RPC_S_SERVER_UNAVAILABLE.
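
    Since the join half-succeeds (the computer object appears) but the trust enumeration then fails with RPC_S_SERVER_UNAVAILABLE, the usual suspects are DNS and the RPC dynamic ports rather than the join itself. A short checklist of stock diagnostics to run from the joining server (the Windows Support Tools provide nltest and netdiag; portqry is a separate free Microsoft download; names are examples):

        rem can the DC locator find a domain controller, and is it the expected one?
        nltest /dsgetdc:MYDOMAIN
        nltest /dclist:MYDOMAIN

        rem is this server using ONLY the AD DNS server, with no ISP resolvers listed?
        ipconfig /all
        netdiag /test:dns /v

        rem RPC needs TCP 135 plus the dynamic high ports through to the DC
        portqry -n dc01.mydomain.local -e 135

    A machine with an external DNS server listed as primary, or a device between the two boxes that drops the high RPC ports, produces exactly this "added to AD but the join dialog still fails" pattern.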

    Read the article
