Search Results

Search found 13411 results on 537 pages for 'proxy servers'.

Page 375 of 537

  • What to have in sources.list on an Ubuntu LTS server (production)?

    - by nbr
    I have several Ubuntu 10.04 LTS servers in production and I'm using apticron to check that my software is up to date, security-wise. However, by default, Ubuntu has the lucid-updates repository enabled. This means lots of low-priority updates (such as this) that I don't need, and thus extra work for me. Is it okay to just remove the lucid-updates line(s) in sources.list? I would still get security updates via lucid-security, right? So, this is what my sources.list would look like:

        deb http://se.archive.ubuntu.com/ubuntu/ lucid main restricted
        deb http://se.archive.ubuntu.com/ubuntu/ lucid universe
        deb http://security.ubuntu.com/ubuntu lucid-security main restricted
        deb http://security.ubuntu.com/ubuntu lucid-security universe
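
    A quick way to confirm the change behaves as expected is to comment out lucid-updates and run a simulated upgrade; afterwards apt should only propose packages from lucid-security. A minimal sketch (the sed pattern assumes the stock one-repo-per-line layout shown above):

        # comment out the lucid-updates entries, then refresh and simulate
        sudo sed -i 's/^deb \(.*lucid-updates\)/# deb \1/' /etc/apt/sources.list
        sudo apt-get update
        sudo apt-get -s dist-upgrade   # -s = simulate; remaining updates should come from lucid-security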

  • Apache httpd + FreeTDS hangs until restarted

    - by Jordan Reiter
    Every so often, requests to a Linux server (say, linux.example.org) whose web app (Django) pulls in data from a SQL Server database via FreeTDS will hang. Requests on other servers pointing to the same database still work, as do requests on linux.example.org that use local MySQL databases. Only this server plus FreeTDS appears to be affected. Restarting httpd makes the database connections work correctly again. What could cause this problem? Using: CentOS 5.9, FreeTDS 0.91, Apache httpd 2.2.3.

    /etc/odbc.ini:

        [DSN]
        Description = SQL Server 2005
        Driver = FreeTDS
        ;Database = dbname
        Servername = SERVERNAME
        ;TDS_Version = 8.0

    /etc/freetds.conf:

        [SERVERNAME]
        driver = /usr/lib64/libtdsodbc.so
        host = db.example.org
        port = 1433
        tds version = 8.0
        client charset = UTF-8
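
    When the hang recurs, it can help to test the two layers separately from the affected box: tsql (ships with FreeTDS) speaks raw TDS using freetds.conf, while isql (from unixODBC) exercises the full ODBC stack that Apache uses. A sketch, with hypothetical credentials:

        tsql -S SERVERNAME -U dbuser -P secret      # raw FreeTDS connection via freetds.conf
        isql -v DSN dbuser secret                   # full ODBC path via odbc.ini

    If tsql connects while the Apache workers stay wedged, the problem is more likely stale pooled connections inside httpd (e.g. a dropped firewall/NAT state) than FreeTDS itself.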

  • Cannot access D-Link 604 setup interface

    - by user36089
    Hello everyone. I use a D-Link DI-604 Ethernet router to share web access. My ISP provides service over plain Ethernet rather than Ethernet PPPoE mode, so the IPv4 address, subnet mask, DNS, and gateway are set manually, and I log in using a web username and password. I tried http://192.168.0.2 to access the DI-604 setup interface, but failed. Running ipconfig /all in a DOS shell displays:

        Ethernet adapter Local Connection:
            Physical Address: 00-3c-56-79-19-49
            IPv4 address: 10.7.8.225
            Subnet mask: 255.255.255.0
            Default gateway: 10.7.8.1
            DNS servers: 10.10.10.10

    What is the correct way to access the DI-604 setup interface and set it to share web access? Welcome any comment. Thanks, interdev
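
    The ipconfig output shows the PC sitting on 10.7.8.0/24 while the router's management address is on 192.168.0.0/24, so the setup page is unreachable from the current subnet. One common workaround is to add a second, temporary address in the router's subnet to the adapter; a sketch for Windows (the .10 address is an assumption, and the adapter name must match what ipconfig shows):

        netsh interface ip add address "Local Connection" 192.168.0.10 255.255.255.0

    Then browse to http://192.168.0.2 and remove the extra address once the router is reconfigured.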

  • What would cause SQL 2008 Log Reader Agent to fail with "This process could not execute 'sp_replcmds'"?

    - by Rick
    I've seen this error message in other posts, but they didn't seem to help resolve our issue. We are trying this with two SQL Server 2008 servers. I backed up my database from the source server and then restored it on our destination server, and we set up basic transactional replication. The Snapshot Agent is working fine, but the Log Reader Agent fails with the error above. Is it most likely a login issue for this job, or a QueryTimeout?
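
    One frequent cause after a backup/restore is that the restored publication database ends up with an invalid or mismatched owner, which breaks the sp_replcmds call the Log Reader Agent makes. A hedged sketch (server and database names are hypothetical; run against the publisher):

        sqlcmd -S PUBSERVER -Q "ALTER AUTHORIZATION ON DATABASE::MyPubDB TO sa;"

    If that isn't it, check that the account the Log Reader Agent runs under is db_owner in the publication database; a QueryTimeout failure normally surfaces as an explicit timeout message rather than this error.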

  • FTP user configuration in openSUSE 12

    - by chieroz
    I usually work with Mac OS X servers, but this time I need to set up an FTP service on an openSUSE 12.2 server and I am a little lost. I am using the remote YaST2 tool via ssh. I created several users who can connect via ssh and/or FTP, so the basic setup is OK. But when connecting via FTP, my users don't have write permissions. The FTP directory for authenticated users is /srv/www/htdocs, which has permissions root:root. The openSUSE manual says it's bad practice to change these permissions, but my normal users (even the ones in the sudoers list) cannot upload files. So I am stuck: as a workaround I use rsync, but from time to time I just need a working FTP connection. What's the right approach to user permissions in this scenario? Thanks a lot.
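
    One conventional compromise is to leave /srv/www/htdocs owned by root but grant write access to a dedicated group via group ownership and a setgid bit, rather than opening the directory to everyone. A sketch, assuming upload users alice and bob (names hypothetical):

        groupadd ftpupload
        usermod -a -G ftpupload alice
        usermod -a -G ftpupload bob
        chgrp ftpupload /srv/www/htdocs
        chmod 2775 /srv/www/htdocs    # setgid: new files inherit the ftpupload group

    If the FTP daemon happens to be vsftpd, also confirm write_enable=YES is set in /etc/vsftpd.conf, or uploads will fail regardless of filesystem permissions.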

  • How do I know if I need to back up locally stored emails?

    - by Sometimes
    I am moving a friend's website and emails from the current server to a new one. I don't have much experience with migrating email, and in the past when moving servers all the emails have disappeared from the user's local inbox, e.g. in MS Outlook. To make my question clearer: how do I know if I have to back up the emails before moving servers? As I understand it, sometimes they are stored locally and sometimes they are not. And how do I know if the emails will remain on the user's machine once I move the data from server to server?
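
    The deciding factor is usually the account type. POP3 clients download messages into a local store (e.g. an Outlook PST file), which survives a server move untouched; IMAP clients only cache what lives on the server, so those mailboxes must be copied to the new host before the old one is wiped. For the IMAP case, a tool like imapsync can copy a mailbox server-to-server; a sketch with hypothetical hosts and credentials:

        imapsync --host1 mail.oldhost.example --user1 user@example.com --password1 'secret1' \
                 --host2 mail.newhost.example --user2 user@example.com --password2 'secret2'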

  • Is it possible to host a web server from behind a NAT?

    - by iamrohitbanga
    My PC is behind a NAT router that has a public IP address. If I want to host a website, then I believe I need a domain name, which I can purchase from a registrar that will resolve DNS requests for that name to the IP address of my NAT router (assuming I do not want to host the site on their servers). Now I want to run the web server on my own computer. What changes should be made to the NAT router's configuration to forward all HTTP requests for example.com to my PC on the internal network? Is the above strategy correct? Is it commonly used?
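
    Yes, this is the standard setup: a DNS A record points the domain at the router's public address, and the router forwards TCP port 80 to the internal machine (consumer routers usually label this "port forwarding" or "virtual server"). For illustration only, if the router were a Linux box doing the NAT, the equivalent rule might look like this (interface name and internal address are assumptions):

        iptables -t nat -A PREROUTING -i eth0 -p tcp --dport 80 -j DNAT --to-destination 192.168.1.100:80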

  • Logging the client IP with Nginx/Varnish/Apache

    - by jetboy
    I have Nginx listening on port 443 as an SSL terminator, and proxying unencrypted traffic to Varnish on the same server. Varnish 3 is handling this traffic, and traffic coming in directly on port 80. All traffic is passed, unencrypted, to Apache instances on other servers in the cluster. The Apache instances use mod_rpaf to replace the logged client IP with the contents of the X-Forwarded-For header. My problem is that if the traffic is coming via Nginx, while the 'correct' client IP is getting logged in the VarnishNCSA logs, it looks as if Varnish is (understandably) replacing Nginx's X-Forwarded-For header with 127.0.0.1 downstream, and this is getting logged with Apache. Is there a nice simple way to stop Varnish rewriting X-Forwarded-For if it's already populated?
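
    Varnish 3's built-in vcl_recv is what appends the connecting peer (here nginx at 127.0.0.1) to X-Forwarded-For, and it only runs if your own vcl_recv falls through without returning. A minimal sketch of a vcl_recv that preserves a header already set by nginx (adjust to fit the rest of your VCL, since returning early skips the remaining built-in logic):

        sub vcl_recv {
            if (!req.http.X-Forwarded-For) {
                # direct port-80 traffic: record the real client
                set req.http.X-Forwarded-For = client.ip;
            }
            # traffic via nginx: leave its X-Forwarded-For untouched
            return (lookup);
        }

    Since only nginx on localhost can inject the header ahead of Varnish in this topology, trusting a pre-set X-Forwarded-For is safe here.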

  • Software for defining rules for folder permissions and monitoring deviations

    - by Kjensen
    Let's say a company has a large number of users, and each user has a home area. On each share used for home area folders, I would like to define rules saying who is supposed to have which permissions on each folder. Then I would like to audit automatically that this is actually the case, and get some sort of report on deviations. So a rule for \\MegaServer\Home01 could be defined something like:

        Domain Admins - Full Control
        Backup Agent - Read
        [Home folder owner] - Full Control

    I am talking about the Windows platform and Windows servers, although I think it would most likely also work for *nix machines that expose Windows shares. Does software like this exist? I could roll my own basic version, but if something already exists, that is usually a better option. I am aware of tools that make displaying permissions easier (AccessEnum, DumpSec), but that is not what I am looking for.
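
    Absent a packaged product, a small PowerShell script can get surprisingly close: walk the home folders, read each ACL with Get-Acl, and flag any ACE that isn't in the rule table. A rough sketch (share path, domain, and rule names are assumptions):

        $rules = @('CONTOSO\Domain Admins', 'CONTOSO\Backup Agent')
        Get-ChildItem '\\MegaServer\Home01' | Where-Object { $_.PSIsContainer } | ForEach-Object {
            $owner = "CONTOSO\$($_.Name)"    # assumes each folder is named after its owner
            foreach ($ace in (Get-Acl $_.FullName).Access) {
                $id = $ace.IdentityReference.Value
                if ($rules -notcontains $id -and $id -ne $owner) {
                    "$($_.FullName): unexpected ACE for $id ($($ace.FileSystemRights))"
                }
            }
        }

    Piping the output to a file or mail step would give the deviation report; the rights values themselves (FullControl vs. Read) could be checked the same way against a hashtable of expected rights.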

  • backup, sync and search files over internet and intranet

    - by Cawas
    There are many online backup options out there: Dropbox, SugarSync, Mozy, Carbonite, Jungle Disk and my favorite so far, CrashPlan. Some of them allow searching, all of them sync with their online servers, but none of those (or the many others I didn't list here) have what I want. I'm _not_ looking for an online backup service here. Sure, some people might say "use rsync", "Linux" and/or "set up Apache" and so on... but that's just too much maintenance, if it's even viable to build. It needs to be simple. So, does anyone know of a really good solution out there? Picture Google Desktop Search's awesome searching, mixed with CrashPlan Desktop, which is able to do everything by itself, plus something like Dropbox's file versioning, along with the ability to sync seamlessly over intranet and internet like CrashPlan, switching between them when needed. I bet there's nothing like this yet, but well, I'm not sure. It would be great!

  • IIS 6.0 mitigating BEAST

    - by D3l_Gato
    Recently, my PCI assessor informed me that my servers are vulnerable to BEAST and failed me. I did my homework and I want to change our web servers to prefer RC4 ciphers over CBC. I followed every guide I could find: I changed my registry keys for the weaker-than-128-bit encryption to Enabled = 0, completely removed the registry keys for the weaker encryptions, and downloaded IIS Crypto and unchecked everything but the RC4 128 ciphers and Triple DES 168. My web server still prefers AES256-SHA. Is there a trick in IIS 6.0 to get your web servers to prefer RC4 ciphers that I am not figuring out? It seems like in IIS 7 they made this very easy to fix, but that doesn't help me now!
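
    On Server 2003 / IIS 6.0, schannel exposes no server-side cipher *order* (the suite order group policy only arrived with Vista/2008), so the selection effectively follows the client's preference among whatever suites remain enabled. The only lever is therefore to disable the AES suites outright, leaving RC4 as the strongest remaining choice. A sketch of the registry changes (key names as they appear on boxes with AES support; reboot afterwards, and test carefully since this turns AES off server-wide):

        reg add "HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\AES 256/256" /v Enabled /t REG_DWORD /d 0 /f
        reg add "HKLM\SYSTEM\CurrentControlSet\Control\SecurityProviders\SCHANNEL\Ciphers\AES 128/128" /v Enabled /t REG_DWORD /d 0 /f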

  • How can I restrict the backuppc client user as much as possible? (rsync)

    - by jxn
    I have BackupPC making full backups of servers, but I'd like to be sure that my setup is as paranoid as possible. BackupPC is set up to back up via rsync, using a specific user on each client to be backed up. Because the BackupPC client user has to have access to every file on the client machine and the ability to ssh into the machine without an interactive password, I'm a little nervous about securing the clients, and I'd like to know I haven't overlooked any options. Here's what I have in place: in the client user's authorized_keys file, I've included

        from="IPTOSERVER",command="/usr/bin/rsync"

    before the user's public key, so that the user can only log in coming from the BackupPC server. Next, in the sudoers file, I've added this line:

        backuppc ALL=NOPASSWD: /usr/bin/rsync

    to allow root-level permissions only for the rsync command for that user. Are there other user, policy, or ssh restrictions that I can add while still allowing the BackupPC client user to rsync all files?
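
    Two refinements worth considering, sketched under the assumption that the exact rsync server arguments are captured from a live backup (the argument string varies with BackupPC's RsyncClientCmd). First, the authorized_keys entry can disable every other ssh facility; second, the sudoers pattern can be pinned to rsync's read-only server mode so the key cannot be used to *write* as root:

        # authorized_keys (one line): only rsync, only from the server, nothing else
        from="IPTOSERVER",command="/usr/bin/sudo /usr/bin/rsync --server --sender ...",no-pty,no-port-forwarding,no-agent-forwarding,no-X11-forwarding ssh-rsa AAAA...

        # sudoers: restrict to the --server --sender (read-only) invocation
        backuppc ALL=NOPASSWD: /usr/bin/rsync --server --sender *

    The "..." stands for the option string BackupPC actually sends; it is deliberately left elided here.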

  • Backup Solr home

    - by user226188
    I'm new to Solr. I've successfully installed Tomcat and the Solr 4.3.1 webapp, plus two collections, on a CentOS 6.4 machine. Now my server is in production and I need to make backups of Solr, so I would like to know the best way to back it up. For the moment I'm doing: stop Tomcat, tar up my Solr home, start Tomcat; but I've read that this is not a good solution? Moreover, it implies stopping the whole Tomcat, which hosts webapps other than Solr. I've also heard that there is a script named "backup" in the Solr home's bin folder, but my bin folder is empty :( I don't want to set up another slave server with replication; for me that's not a backup solution, because my backups are supposed to be sent to a Bacula backup server every night. Is there no built-in solution that I can script around, like mysqldump for MySQL servers? Thanks for the help!
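
    Solr 4.x does have a built-in hook for this: the ReplicationHandler (normally defined in solrconfig.xml) accepts a backup command over HTTP, which snapshots the index without stopping Tomcat. A sketch, with the collection name, port, and target path as assumptions:

        curl 'http://localhost:8080/solr/collection1/replication?command=backup&location=/var/backups/solr&numberToKeep=2'

    The snapshot directory it writes can then be picked up by the nightly Bacula job like any other file tree. Note this captures the index but not the conf/ directories, so the configuration is still worth tar'ing separately.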

  • Can't browse computer via NLB cluster name

    - by peg_leg
    I have a file server NLB cluster, currently set to single affinity, made up of two 2008 R2 servers. We switched the primary node today. Now our Windows XP workstations can't browse to the cluster name (i.e. \\fileserver) but can browse to the cluster IP address (i.e. \\192.168.1.1) and can browse the member server by name (i.e. \\filesvr1). I remember having a similar issue when we had to change a registry setting to allow Windows XP boxes to see another file server that was in a failover cluster but had to be referred to by another name (\\thisfileserver instead of \\fileserver). Convoluted, for sure, but it helped to prevent any code changes from happening. Well, all of the programmers have their code on \\fileserver and we can't have them switch their links every time \\filesvr1 supersedes \\filesvr2 or vice versa. I can't remember the registry setting that allowed the file server to ignore that it's being called by the wrong name. HELP!
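
    The setting usually involved here is DisableStrictNameChecking, which tells the Server service to accept SMB connections under names other than the machine's own; in a domain the alias also needs matching SPNs, or Kerberos authentication fails for the alternate name. A sketch (run on the active node; the domain suffix is an assumption):

        reg add "HKLM\SYSTEM\CurrentControlSet\Services\lanmanserver\parameters" /v DisableStrictNameChecking /t REG_DWORD /d 1 /f
        setspn -A HOST/fileserver filesvr1
        setspn -A HOST/fileserver.example.local filesvr1

    A restart of the Server service (or a reboot) is needed for the registry change to take effect.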

  • Strange PHP output buffering

    - by radek-k
    PHP:

        header('Content-type: text/plain');
        for ($i = 0; $i < 10; $i++) {
            echo "$i\r\n";
            ob_flush();
            flush();
            sleep(1);
        }

    I tried the script above on two different servers. Both respond with the numbers 0...9, one per line. In the case of the first server, each number is received every second. In the case of the second server, there is no output for 10 seconds and then the entire output is displayed at once. What might be wrong in the second case? I tried various output control functions but it didn't help. The set of response headers in both cases is pretty much the same:

        HTTP/1.1 200 OK
        Date: Mon, 03 Jan 2011 19:21:21 GMT
        Server: Apache
        X-Powered-By: PHP/5.2.14
        Keep-Alive: timeout=15, max=100
        Connection: Keep-Alive
        Transfer-Encoding: chunked
        Content-Type: text/plain
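
    The usual suspects on the buffering server are PHP's output_buffering / zlib.output_compression settings or Apache's mod_deflate holding the whole response to compress it. A hedged sketch of what to try at the top of the script, before the loop (apache_setenv only exists under mod_php):

        ini_set('zlib.output_compression', '0');   // don't gzip-buffer the response in PHP
        ini_set('implicit_flush', '1');
        if (function_exists('apache_setenv')) {
            apache_setenv('no-gzip', '1');         // opt this request out of mod_deflate
        }
        while (ob_get_level() > 0) {
            ob_end_flush();                        // drain buffers opened by output_buffering
        }

    If output still arrives in one burst, look for a buffering proxy or gzip filter sitting in front of the second server.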

  • How To Send Email to Active Directory Group?

    - by BGM
    Salvete! I have two servers: one hosts my email server (hMailServer on Windows Server 2003) and the other hosts Active Directory (on Windows Server 2008). I don't have Microsoft Exchange. In Active Directory, there are user groups that have email addresses. How can I send an email to a user group? Somewhere I need to be able to connect my mail server to Active Directory. Maybe AD has a mail pickup folder? I can't find the information that I need. Here is a similar link, but it didn't help me: Send As Distribution Group Email Address? (I think a tag for hMailServer would be a good idea.) Thanks for y'all's help.

  • Exchange 2010 Autodiscover/OAB update issue

    - by bulldog5046
    Midway through a migration from 2003 to 2010, and with a few test users on 2010, I've noticed that the OAB is not being downloaded to Outlook clients. I've checked that the URLs are configured, added both our CAS servers to the web-based distribution list for the OAB, and assigned the OAB to the two mailbox databases we use; but when I run Outlook's 'Test E-mail AutoConfiguration', I still see that autodiscover says "OAB URL: Public Folder", even though I've now deselected that option. I ran Test-OutlookWebServices, which gave an OAB error about no URL in the autodiscovery, but having just re-run it, that now appears fine; yet the autoconfiguration test still does not. Does anyone have any idea why I'm getting this discrepancy?
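
    Two things worth checking from the Exchange Management Shell: that the OAB object itself no longer advertises public folder distribution, and that a fresh generation/distribution pass has run since the change (Autodiscover hands out the OAB's current settings, and the file distribution service can lag). A sketch, assuming the default OAB name:

        Get-OfflineAddressBook | Format-List Name,PublicFolderDistributionEnabled,WebDistributionEnabled,VirtualDirectories
        Update-OfflineAddressBook "Default Offline Address Book"
        Restart-Service MSExchangeFDS    # CAS-side file distribution re-polls the OAB

    Outlook also caches Autodiscover results, so it's worth retesting with a fresh Outlook profile after the update.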

  • Possible to redirect from HTTPS to HTTP behind load-balancer?

    - by Derek Hunziker
    I have a basic ASP.NET application that sits behind an F5 load balancer. Incoming SSL requests (over HTTPS) terminate at the load balancer, and all internal communication between the load balancer and my application servers is unsecured (over HTTP). When an unsecured request comes in, my app is able to use Response.Redirect("https://...") to redirect to a secure URL with no problems. However, the other direction appears to be impossible: I cannot redirect from HTTPS to HTTP using Response.Redirect() from my application. The URL remains HTTPS for the client and does not change. Could the F5 be preventing the redirect from ever reaching the client? Is there any special configuration necessary to let this happen?

  • Windows IPSec computer authentication using *user* account?

    - by Tim Brigham
    For some reason, every once in a while my IPsec authentication is from a user account to a computer account, not computer to computer. How can I fix it? Sometimes, notably when I try to add a new workstation through ePO (though it's happened other times as well), I get strange behavior from my Windows Advanced Firewall IPsec. This causes the authentication to be invalid (as the group memberships, etc. all assume computer accounts). I have no idea why this is happening or how to fix it, but I'd expect the IDs to match up between servers (the opposite server in my second example shows remote ID timb).

  • How can I easily confirm in Linux that two separate directories have the exact same contents?

    - by Mike B
    CentOS 5.x. My question seemed similar to this one, but I wasn't sure... I have two servers (completely isolated from each other), each with a directory and sub-directories that should have the exact same contents. For example, the directory layout could be something like:

        SERVER A -
        /opt/foo/foob/1092380298309128301283/123.txt
        /opt/foo/foob/5094380298309128301283/456.txt
        /opt/foo/foob/5092380298309128301283/789.txt
        /opt/foo/foob/1592380298309128301283/abc.txt

        SERVER B -
        /opt/foo/foob/1092380298309128301283/123.txt
        /opt/foo/foob/5094380298309128301283/456.txt
        /opt/foo/foob/5092380298309128301283/789.txt
        /opt/foo/foob/1592380298309128301283/abc.txt

    Ideally I'd like a way to do a recursive check and have something confirm that everything matches. I also want to avoid using any third-party tools. Any ideas?
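
    Since the servers can't talk to each other, one stock-tools approach is to build a sorted checksum manifest on each side and compare the two manifests by any means available; hashing the manifest itself reduces the comparison to a single string. A sketch using only what CentOS 5 ships with:

        # run the same commands on each server
        cd /opt/foo/foob && find . -type f -exec md5sum {} + | sort -k 2 > /tmp/manifest.txt
        md5sum /tmp/manifest.txt    # if these two hashes match, the trees' file contents match

    Note this compares file contents and relative paths only, not ownership, permissions, or timestamps.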

  • Windows: Should I install Server or stick with regular?

    - by stalker92
    I hope somebody can help me solve my dilemma. I have my home PC (running Windows 7) which I use for both work and leisure (gaming, surfing, movies, etc.). I tend to never turn it off, except when I must reboot because an installation requires it or the power is lost. But sometimes Windows starts acting weird (usually after a long period of uptime); for example, it randomly eats up all the space on my system partition, which is resolved by a reset. I was thinking of switching to Windows Server; I guess it is more optimized for long uptime, as it is obviously meant for use on servers. Can somebody with more experience help me decide whether it is worth it, and whether it will solve these issues connected with long uptime periods? Thanks in advance.

  • How to use radiusclient-ng?

    - by Muhammad Gelbana
    A guy on my team compiled the radiusclient and radlogin executables found on that page, but installing it is getting more and more problematic and I can't seem to get anywhere! I received from him:

        radclient
        libfreeradius-client.so.2
        servers
        radiusclient.conf
        dictionary.dat
        radlogin

    What I'm trying to do is install this client on a Linux box and then:

        1. Access that box remotely using ssh.
        2. Issue authentication/accounting requests to another remote RADIUS server.

    But nothing about this seems intuitive, and I have very little experience with Linux and RADIUS protocols! Has anyone successfully installed that client? Thank you.
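
    For radiusclient-ng, the two files that normally need editing are radiusclient.conf (which points at the RADIUS server and at the client's own support files) and servers (which holds the shared secret). A hedged sketch with a hypothetical host, secret, and install path:

        # radiusclient.conf (excerpt)
        authserver      radius.example.com
        acctserver      radius.example.com
        servers         /etc/radiusclient-ng/servers
        dictionary      /etc/radiusclient-ng/dictionary

        # servers
        radius.example.com      sharedsecret

    With those in place, running radlogin from the ssh session should attempt an authentication round trip against the configured server.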

  • How to achieve redundancy across data centers?

    - by BrandonBT
    I have a LAMP server with a lot of hardware redundancy built in, so I am not worried about the server itself becoming unavailable. What I am worried about, however, are potential network issues in the data center the server is in. What I would like is another server in another data center for redundancy; load balancing is less of a concern. With that said, I am relatively clueless on two points:

        1. How to have two servers in two geographically separate data centers that hold exactly the same data, in terms of both files and MySQL databases.
        2. How to ensure that all traffic coming into one data center is automatically redirected to the other data center in the case of a network or server failure at the first.

    Any guidance on how to accomplish the above two problems would be greatly appreciated.
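
    A common low-budget pattern is asynchronous MySQL replication for the database plus a periodic one-way file sync, with failover handled by low-TTL DNS (or a managed DNS failover service) rather than anything at the network layer. A sketch of the data half, with hostnames and credentials as assumptions (the standby's my.cnf needs a unique server-id, and the primary needs log-bin enabled):

        # files: push the web root to the standby every few minutes via cron
        rsync -az --delete /var/www/ standby.example.com:/var/www/

        # database: point the standby at the primary, then start replicating
        mysql -e "CHANGE MASTER TO MASTER_HOST='primary.example.com',
                  MASTER_USER='repl', MASTER_PASSWORD='secret';
                  START SLAVE;"

    The traffic half then amounts to a health check that swaps the DNS A record to the standby's address when the primary data center stops answering.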

  • Bad switches duplicate my IP

    - by tacoen
    I have a large-area LAN with many switches and APs on it. At some point I couldn't ping my servers, and the IP was reported as duplicated. I used arpwatch and found out that one of the switches was flip-flopping the IP. I isolated that troublesome switch using its MAC address. But since this is a large-area LAN, I doubt this will be the last such case. Is there any software or hardware that I can use to prevent this kind of error? Sorry for my bad English.
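
    As a stopgap on the hosts themselves, the critical gateway and server addresses can be pinned with static ARP entries so a misbehaving device can't hijack them; longer-term, managed switches offer features such as DHCP snooping and dynamic ARP inspection for exactly this problem. A sketch of the stopgap on a Linux host (address and MAC are hypothetical):

        arp -s 192.0.2.1 00:11:22:33:44:55    # pin the gateway's MAC; repeat for other critical hosts

    arpwatch itself can also mail alerts on flip-flops, which at least shortens the hunt for the offending switch port.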

  • Use Google Apps/Cloud Services as a Domain Controller Replacement

    - by user124548
    This is a Canonical Question about cloud services replacing Active Directory. Is it possible to use Google Apps or another cloud service as a replacement for a Windows domain controller (replacing my whole AD infrastructure)? Specifically, I want to remove our dependence on a local Windows server; currently it acts as a domain controller with file and print services. I'd like to seamlessly replace this server with something based on hosted applications, not just move it to a dedicated or colocated server. I have yet to figure out how to piece together printer/etc. sharing. If anyone has any insight into this, it would be appreciated. The goal is to eventually move all my servers to the cloud and then write up a case study on the whole affair.
