Search Results

Search found 29101 results on 1165 pages for 'open basedir'.


  • Want to use something like Citrix XenDesktop, Free Alternative?

    - by Chris
    I'm looking to go into IT, general office server management, and it looks like XenDesktop would be an awesome tool to use. If I understand it right, you store a central image of the OS you want to deploy (in an ISO file) on the main server, then use XenDesktop to pull that image down to the client, which boots the OS inside a virtual machine. Does it download the image of the OS and store it locally (like cloning the VM onto the client)? I'd love to find a free (possibly open source?) alternative to this. I keep hearing about KVM on Linux and PXE booting a minimalistic OS to use remote KVMs... would that be what I'm looking for? Ideally, I'd like a system:
    - that allows me to manage one central image for multiple clients (virtualized hardware)
    - that easily boots a thin client OS which connects to the central system, the way a thin client connects to XenDesktop
    Would those things be possible with some kind of free alternative? Some guidance would be greatly appreciated.

  • Restricting Access to Application(s) on Point of Sale system

    - by BSchlinker
    I have a customer with two point of sale systems, a few workstations, and a Windows 2003 SBS server. The point of sale systems typically run QuickBooks Point of Sale and are logged in with a user who has restricted permissions/access (via Group Policy). Occasionally, one of the managers needs to run a few additional applications, including some accounting software. I have created an additional user for this manager, allowing them to log in and access the accounting software. The problem is that switching users on the system is slow, as QuickBooks takes a few minutes to close (as POSUser) and then reopen (as ManagerUser). If customers are waiting, this slows things down drastically. Since the accounting software is stored on a network drive, it would be easiest if the manager could simply double-click something, authenticate against the network drive / domain controller, and then the program would launch. When they close the program, the session to the network drive would be lost and the program would no longer be accessible. Is there any easy way to do this? Both users are on a domain and the system is Windows 7. I just don't want to require the user to switch back and forth. In a worst-case scenario, they forget to switch back and leave the accounting software wide open.
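    One hedged sketch: the built-in runas command can launch a single program under the manager's account without a full user switch. The domain, server, share, and program names below are placeholders, not the customer's real ones:

        @echo off
        rem Map the accounting share as ManagerUser, run the app, then drop the mapping.
        rem runas prompts for ManagerUser's password each time it is launched.
        runas /user:MYDOMAIN\ManagerUser "cmd /c net use X: \\fileserver\accounting /persistent:no && X:\Accounting.exe && net use X: /delete"

    Because the drive mapping lives inside that one cmd session, it disappears when the program exits, which matches the "session to the network drive would be lost" requirement.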

  • Website Use Monitoring for 3 People

    - by linkedlinked
    I work in an IT startup with 2 partners, and I'm the programmer/IT guy -- in other words, the workhorse. To make a long story short, I'm doing most of the work right now, while they spend all day on Facebook. That's OK, because they're paying my salary, but if the project fails, I'm sure they'll blame me for it (I'm doing my best to make sure that doesn't happen!), and I want some sort of recourse. I already have an app that blocks time-wasters on my local PC and keeps logs of when the app is enabled (so I can say "I had Facebook blocked from 9am-5pm today."). Is there any way I can get a brief summary of the most heavily visited sites, split up by client PC? At the end of the month, I want to be able to say "You both load Facebook, on average, every 10 minutes. You spend hours a day on YouTube, and haven't opened our bug tracker in weeks", and maybe have a nifty chart or graph to match it. We have a crappy D-Link router and no IT budget. They are both on Windows Vista; I run Ubuntu Linux. I don't want to install any monitoring software on their PCs, but I'm totally fine with, say, routing all the network traffic through my machine. I guess I can think of lots of ways to accomplish this (telnet into JSSH and list open tabs? log all the DNS requests, per domain? even thinking of setting up a webcam on my desk and just keeping 5-minute snapshots...), I just don't really know where to start. Any advice is appreciated, thanks!
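    Following the poster's own DNS idea, one zero-cost sketch: hand out the Ubuntu box as the DNS server (via the router's DHCP settings) and let dnsmasq log every query per client. log-queries and log-facility are standard dnsmasq options; the log path is an assumption:

        # /etc/dnsmasq.conf
        log-queries                          # log each DNS query together with the client IP
        log-facility=/var/log/dnsmasq.log    # keep the queries out of the general syslog

        # Rough summary of hits per client/domain (field positions depend on the
        # syslog prefix, so adjust the awk columns to the actual log format):
        # grep 'query\[A\]' /var/log/dnsmasq.log | awk '{print $8, $6}' | sort | uniq -c | sort -rn | head -20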

  • .htaccess redirect to error page if port is not 80

    - by Momo
    I'm running a portable server from a USB stick. The thing is, I also have WAMP installed on my local machine, and its Apache gets started on Windows startup (because of some random reason which I don't recall now, and it can't be changed). I want to prepare my portable server for situations like this, so killing httpd.exe in the process list and restarting my portable server is not an option. Because of the already active httpd.exe, my portable server's WordPress site can only be accessed through localhost:81. This is a problem, as the WP site is very dependent on the URL, and I don't want to store the URL with the port in the WP database. Here is what I want to do through .htaccess: on any path except the error.php file, check whether the port is 80; if it is not port 80, redirect to /error.php?code=port. Is it possible for this to take priority over WP's redirection and URL handling? In error.php I provide info on how to manually close httpd.exe and such, so my family and friends can access the portable site. It's sort of like a gallery and calendar application for events and other such stuff... Please help; I can't figure it out at all. I know others may not have Apache already running, but I want to prepare for such a situation. I tried something like the following, but it doesn't work:

        # BEGIN WordPress
        <IfModule mod_rewrite.c>
        <If "%{SERVER_PORT} = 80">
        RewriteEngine On
        RewriteBase /
        RewriteRule ^index\.php$ - [L]
        RewriteCond %{REQUEST_FILENAME} !-f
        RewriteCond %{REQUEST_FILENAME} !-d
        RewriteRule . /index.php [L]
        </If>
        <Else>
        RewriteEngine On
        RewriteRule ^(error.php)($|/) - [L]
        RewriteRule ^(.*)$ /error.php?code=port [L]
        </Else>
        </IfModule>
        # END WordPress

    By the way, the portable server (Server2Go) automatically generates vhosts based on the hostname set in its config file, and changes ports if the port (e.g. 80) is already taken.
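    A hedged guess at the failure: <If>/<Else> sections only exist in Apache 2.4, so on a 2.2-era httpd the block above is rejected or ignored. The same port test can be written with RewriteCond, which works on 2.2 as well; this is a sketch, not tested against Server2Go:

        RewriteEngine On
        # Anything arriving on a port other than 80 goes to the error page,
        # except requests for error.php itself (to avoid a redirect loop).
        RewriteCond %{SERVER_PORT} !^80$
        RewriteCond %{REQUEST_URI} !^/error\.php
        RewriteRule ^ /error.php?code=port [L]

        # ...then the stock WordPress rules follow, for the port-80 case.

    Placed above the WordPress block in the same .htaccess, these rules run first, which answers the priority question.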

  • Need help using a super scope

    - by Vdub
    I have a Windows Server 2008 R2 Standard machine running our DHCP, DNS, and AD. I also have three HP ProCurve 2510-G switches (J9280A). Right now our LAN is set up with 192.168.50.2-192.168.50.254 on subnet (A) and another scope with 192.168.51.2-192.168.51.254 on subnet (B); both have a subnet mask of 255.255.255.0. The same server is our DNS server at 192.168.50.242, and our firewall (WatchGuard) is the gateway at 192.168.50.1. Right now subnet (B) does not have DHCP active, so only subnet (A) is handing out a pool. My problem is that we are trying to offer open WiFi on our network, and I am assuming I can use subnet (B) for that if I activate it, and keep subnet (A) for staff only. I have noticed that when I give a client PC a static address of 192.168.51.x, I cannot use the DNS server at 192.168.50.242, yet 8.8.8.8 works fine; I am guessing that is because it is on a different subnet? Forgive me, as I am very new at this and don't know a lot. Is there an easy way, with the equipment I have, to accommodate WiFi for hundreds of people without causing problems for our staff (multiple identical IP address assignments)? I appreciate any and all info!

  • Logitech QuickCam Pro 9000 & Windows 7 64-bit failing miserably

    - by Saxtus
    I am trying to install a Logitech QuickCam Pro 9000 webcam on Windows 7 64-bit. If I skip the Logitech drivers and use the Windows Update ones instead, the camera works, but with a low frame rate and without face tracking and all the other bells and whistles that its full driver provides. The moment I install the latest official Logitech driver, the problems begin: the camera works fine until I open the audio settings of the LWS panel or of Windows. Then LWS freezes, and with it everything that tries to output audio. I am not able to open the Playback/Recording devices window (it just doesn't appear), and the system gets unstable and slow, with the LWS.EXE process impossible to close even forcefully. If I reboot with the camera still connected, this situation continues and the system is unstable from the start. If I reboot without the camera connected, everything works fine until I connect it and touch the audio settings of Windows or the LWS panel. I should note that until the freeze occurs, the camera works as expected, with full frame rate, face tracking, and everything else it is supposed to do. The sound card is the ASUS SupremeFX II on an ASUS Striker II Extreme motherboard. Any ideas what is causing this, or what else to try so I can make it work as advertised? Thank you.

  • My Computer hangs for a few minutes just after startup, and then is fine.

    - by EvilChookie
    So I just built myself a reasonably beefy computer and installed Windows 7 on it. However, when I start the machine up each morning, within a few minutes the computer will semi-hang. That is, the mouse is responsive, and most of the time I can open Task Manager or a new tab in Chrome, but windows will occasionally be labelled as 'Not responding'. Then the machine gets over its problem and is nice and quick until I turn it off. Here are my specs:

        CPU: AMD Phenom II X4 955 Black (quad core, 3.2 GHz)
        RAM: 4 GB of DDR3-1300
        Mobo: ASUS M4A785T-M (latest BIOS)
        Hard drives: 2x 1 TB Western Digital Caviar Black in RAID 0
        OS: Windows 7 Ultimate x64
        GPU: ASUS GT240 1GB

    I believe this issue relates to the RAID array, as I didn't have the lockup problem before I created the array. I purchased a second drive and reformatted after creating the RAID array, since the single drive was a little on the pokey side (compared to the rest of the computer). What I have tried:

        Updated RAID drivers
        Malware checks
        Windows Updates
        Disabled unnecessary services

    CPU and disk activity appear to be low (via Resource Monitor), and there are no strange errors in the event log. Any thoughts?

  • Cannot connect to remote mail server for sending emails in ASP.NET

    - by Dave
    I want to migrate a web application from a Windows Server 2003 machine to a Windows Server 2008 R2 machine. Everything works fine except sending emails from the application. If I configure the application to use the SMTP server on "localhost" it works, but after changing it to the "real" host name (e.g. mail.example.org) no mail is sent. The error message says that the remote server requires a secure connection or SMTP authentication. But since it works when using "localhost" instead of the host name, I doubt that is the problem. It is also unlikely to be a problem with the mail server; I tried it with another one as well. So to me it looks like the firewall is blocking the outgoing connection to the mail server. I tried to open port 25, but it still did not work. Maybe I just did it the wrong way.

    Update: To clarify my setup: I have a Windows Server 2008 R2 machine with hMailServer installed (set up for some of the hosted domains). For the website I'm talking about, I need to use an external mail server (at a totally different hosting provider). Apparently I was a bit off track. It seems to work when connecting to the local mail server with either the host name "localhost" or "mail.somedomain.com" (where somedomain.com is set up in my mail server). But when using the host name of the external mail server ("mail.externaldomain.com"), it seems to connect to the local server again, although this domain is not set up in the mail server. Thanks to Evan Anderson for the tip to use telnet; why hadn't I thought of that myself?... :-) Note: the website www.externaldomain.com is hosted on my server, but the DNS entries are maintained by the other hosting provider. "externaldomain.com" is the only entry that points to my server; all other records (MX, subdomains) point to the other server. So I think the question now is: how do I get my server to connect to the external mail server? Do I have to configure this in my mail server, or is it a Windows Server thing?
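    To pin down which machine the name actually resolves to from the server (the suspicion in the update), name resolution and a raw SMTP session can be checked from a command prompt; the host name is the placeholder used in the question:

        C:\> nslookup mail.externaldomain.com
        C:\> telnet mail.externaldomain.com 25

    If the SMTP banner that appears belongs to the local hMailServer, the name is resolving to this machine; the hosts file (C:\Windows\System32\drivers\etc\hosts) and any local DNS zone covering externaldomain.com would be the places to look.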

  • Avoid "privacy pitfalls" in Windows and Linux?

    - by Somebody still uses you MS-DOS
    I have a Windows machine and a Linux machine. In Windows, every time I visit a site, a lot of cache/history files are created on my machine. I set up my Firefox not to save anything... but Windows saves a lot of "temp" files, and I found some strange entries in the registry (like video names). Each video I open in VLC shows up under "Last shown videos", and in Windows all opened files can be found under "Recent opened files" as well. A lot of these privacy settings can be tweaked individually (VLC and "Recent opened files" in Windows); it's a pain doing it one by one, but it's possible. What's missing is a guide to these "internal" privacy traces left behind on a Windows installation. On Linux, I only know of these problems at the application level (like VLC). My question is: is there a complete guide to avoiding undesirable traces of what I did/watched/used on my Windows machine (deleting them every time the PC is restarted, or even avoiding recording this info at all)? Is there a website with configuration guides for different types of software? I would like to know about Linux privacy pitfalls as well.

  • SSL timeout on some sites, across all browsers, on Mac OS X Snow Leopard

    - by dansays
    For the past several weeks, I've been receiving "Error 7 (net::ERR_TIMED_OUT): The operation timed out" when I attempt to connect to either Twitter or PayPal via SSL. I get this specific error in Google Chrome, but the same problem occurs in both Safari and Firefox. Other sites work fine, and other computers on my network can access these two sites. I have no firewall settings that would prevent me from accessing these sites over port 443. I notice that both Twitter and PayPal have "VeriSign Class 3 Extended Validation SSL CA" certificates; it is unclear whether this is related to the problem. In an effort to troubleshoot, I attempted to open the test sites referenced on VeriSign's root certificate support page, which worked fine. Just to be sure, I downloaded the root package file and installed all the included VeriSign certificates. No joy. I feel like I've hit a dead end. Any ideas?

    Update the first: I also cannot connect to FedEx.com, which also has a VeriSign Class 3 Extended Validation cert.

    Update the second: Aaaaaaand it fixed itself. I did nothing. Or I did something that worked, but in a delayed fashion. Frustrating, but a win is a win. I'll take it.
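    For anyone hitting the same symptom: the handshake can be tested outside any browser with the stock OpenSSL that ships with OS X, which separates browser problems from network-path problems:

        $ openssl s_client -connect twitter.com:443
        # A healthy connection prints the server's certificate chain and waits at
        # an input prompt; a hang before any output points at the network path or
        # port 443 filtering rather than browser or certificate settings.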

  • Recommendation on remote access setup for accessing customer systems

    - by gregmac
    I'm looking for a product recommendation (open or commercial) that will allow remote access to customer sites for tech support purposes. We need to be able to gain access to help troubleshoot problems on servers. Currently we end up using anything from RDP on a public IP, to whatever VPNs the clients happen to have, to webex-type sessions that require lots of interaction from both sides to get things working. This often means a problem that could take 10 minutes to solve takes an extra 30+ minutes of messing around trying to get a connection up. There are multiple customer sites, which should NOT have access to each other. At each site, there is anywhere from 1 to 8 servers (Windows 2003 or 2008) that need to be accessed. Requirements:
    - Support connections to machines even if they're behind a firewall/router with no public IP
    - Be able to selectively allow/deny access from the customer site
    - The customer site should not be able to connect outbound to anywhere else (our systems, or other customer sites)
    - Support multiple users from our end
    - If not a VPN connection (where RDP could be used over top), it should support remote desktop access (including copy/paste) and file transfers
    - Preferably some way to list all remote systems, showing online/offline
    Anyone have any suggestions?

  • Backup script to FTP with timed subfolders

    - by Frederik Nielsen
    I want to make a backup script that makes a .tar.gz of a folder I define, say e.g. /root/tekkit/world. This .tar.gz file should then be uploaded to an FTP server and named by the time it was uploaded, for example: 07-10-2012-13-00.tar.gz. How should such a backup script be written? I already figured out the .tar.gz part; I just need the naming and the uploading to FTP. I know that FTP is not the most secure way to do it, but as it is non-sensitive data and FTP is the only option I have, it will do.

    Edit: I ended up with this script:

        #!/bin/bash
        # Back up a directory into a timestamped .tar.gz and upload it via lftp.

        # have some path predefined for backup unless one is provided as first argument
        BACKUP_DIR="/root/tekkit/world/"
        TMP_DIR="/tmp/tekkitbackup/"
        FINISH_DIR="/tmp/tekkitfinished/"

        # construct name for our archive
        TIME=$(date +%d-%m-%Y-%H-%M)

        if [ $1 ]; then
            BACKUP_DIR="$1"
        fi

        echo "Backing up dir ... $BACKUP_DIR"
        mkdir -p $TMP_DIR $FINISH_DIR   # -p also creates FINISH_DIR, which the first draft never made
        cp -R $BACKUP_DIR $TMP_DIR
        cd $FINISH_DIR
        tar czvfp tekkit-$TIME.tar.gz -C $TMP_DIR .

        # create upload script for lftp
        cat <<EOF> lftp.upload.script
        open server
        user user password
        lcd $FINISH_DIR
        mput tekkit-$TIME.tar.gz
        exit
        EOF

        # start backup using lftp and the script we created;
        # if all went well print a simple message and clean up
        lftp -f lftp.upload.script && ( echo Upload successful ; rm lftp.upload.script )
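    To run it on a schedule, a cron entry works; the script path here is an assumption:

        # crontab -e (as root)
        # minute hour day month weekday  command
        0 13 * * *  /root/backup-tekkit.sh

    With the timestamp format above (%d-%m-%Y-%H-%M), a daily run at 13:00 produces exactly names like 07-10-2012-13-00.tar.gz.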

  • Why won't my computer go to sleep automatically?

    - by Django Reinhardt
    Windows 8 is set to sleep after 30 minutes, and it used to work, but recently it has started refusing to sleep. (I can still manually put it to sleep without any issue.) I was having issues a while ago, but those were with my network adapter; that has since been disabled, so it's definitely not that. I've checked to see what devices are able to wake up my machine, but it only appears to be my mouse. Which is odd, because I haven't recently changed my mouse, and more confusing still: the monitor does go to sleep just fine. If it were actually the mouse keeping my system awake, I'm pretty sure the monitor wouldn't go to sleep. I've checked my wake timers: nothing. I've also checked my existing requests...

    UPDATE: I found something. What to do with it, I don't know... Note: even when /requests says there is "NONE" under every category, my machine still won't sleep(!). In short: how can I tell what's preventing my computer from sleeping?

    UPDATE: OK, so I now have a few more pieces of the puzzle. I came back to my computer and it was ASLEEP! Lawks! It seems that the only times it doesn't sleep is when VLC Player is open, even if a video isn't actually playing.

    UPDATE UPDATE: OK, so it won't sleep sometimes when VLC Player ISN'T running, either. Bah!
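    The "/requests" mentioned above is powercfg; the same tool can override a process that holds the system awake. A sketch, assuming VLC really is the blocker:

        rem from an elevated command prompt
        powercfg /requests
        rem if vlc.exe appears under SYSTEM, tell Windows to ignore its request:
        powercfg /requestsoverride PROCESS vlc.exe SYSTEM
        rem review (or later remove) the configured overrides:
        powercfg /requestsoverride

    This does not explain the runs where VLC wasn't open; repeating powercfg /requests during one of those episodes is the quickest way to catch the other culprit.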

  • Juniper’s Network Connect ncsvc on Linux: “host checker failed, error 10”

    - by hfs
    I’m trying to log in to a Juniper VPN with Network Connect from a headless Linux client. I followed the instructions and used the script from http://mad-scientist.us/juniper.html. When running the script with the --nogui switch, the command that finally gets executed is:

        $HOME/.juniper_networks/network_connect/ncsvc -h HOST -u USER -r REALM -f $HOME/.vpn.default.crt

    I get asked for the password, a line “Connecting to…” is printed, but then the program silently stops. When adding -L 5 (most verbose logging) to the command line, these are the last messages printed to the log:

        dsclient.info state: kStateCacheCleaner (dsclient.cpp:280)
        dsclient.info --> POST /dana-na/cc/ccupdate.cgi (authenticate.cpp:162)
        http_connection.para Entering state_start_connection (http_connection.cpp:282)
        http_connection.para Entering state_continue_connection (http_connection.cpp:299)
        http_connection.para Entering state_ssl_connect (http_connection.cpp:468)
        dsssl.para SSL connect ssl=0x833e568/sd=4 connection using cipher RC4-MD5 (DSSSLSock.cpp:656)
        http_connection.para Returning DSHTTP_COMPLETE from state_ssl_connect (http_connection.cpp:476)
        DSHttp.debug state_reading_response_body - copying 0 buffered bytes (http_requester.cpp:800)
        DSHttp.debug state_reading_response_body - recv'd 0 bytes data (http_requester.cpp:833)
        dsclient.info <-- 200 (authenticate.cpp:194)
        dsclient.error state host checker failed, error 10 (dsclient.cpp:282)
        ncapp.error Failed to authenticate with IVE. Error 10 (ncsvc.cpp:197)
        dsncuiapi.para DsNcUiApi::~DsNcUiApi (dsncuiapi.cpp:72)

    What does "host checker failed" mean? How can I find out what it tried to check, and what failed? The Host Checker Configuration Guide mentions that a $HOME/.juniper_networks/tncc.jar gets installed on Linux, but my installation contains no such file. From that I concluded that Host Checker is disabled for my VPN on Linux? Are the POST to /dana-na/cc/ccupdate.cgi and “host checker failed” connected, or independent? By running the connection through an SSL proxy I found out that the POST data is status=NOTOK. (Funny side note: the client of the oh-so-secure VPN does not validate the server’s SSL certificate, so it is wide open to MITM attacks...) So it seems that it’s the client that closes the connection, and not the server.

  • How to create domain or router-level workgroup (dd-wrt micro)

    - by Anthony
    In Windows, is Active Directory required to use a "Domain" instead of a "Workgroup"? Do I need to register a domain with a DNS provider like GoDaddy? What I really want to do is set up my home LAN so that everyone connecting to the main router (which is everyone, about 30 people) can see each other. I've tried having everyone use the same workgroup name; still hit or miss. I tried setting the domain name and host name on the router itself; still nothing. I've tried joining the domain name I set instead of the workgroup, and I get an AD error. But ideally, everyone who is connected to the main router should simply see each other and any shared folders. I've had this problem when I was not the network admin on other large LANs, and I've never been able to figure out why sometimes people disappear or never see each other. I'd really prefer using the native sharing functionality in the OS over setting up an internal FTP or Samba server, etc. Any sure-fire ways to fix this? (Maybe an open source clone of AD?) Thanks!

  • Why java -version returning a different version than the one defined in JAVA_HOME?

    - by Shekhar
    I am trying to set JAVA_HOME on Ubuntu. I have copied JDK 1.7 into /usr/lib/jvm and set JAVA_HOME in the /etc/profile file. The contents of the /usr/lib/jvm folder are as follows:

        shekhar@ubuntu:~$ ls /usr/lib/jvm/
        default-java        java-1.6.0-openjdk       java-6-openjdk         java-6-openjdk-i386  jdk1.7.0_01
        java-1.5.0-gcj-4.6  java-1.6.0-openjdk-i386  java-6-openjdk-common  java-7-openjdk-i386

    and the last few lines of the /etc/profile file are as follows:

        export JAVA_HOME=/usr/lib/jvm/jdk1.7.0_01
        export PATH=$PATH:$JAVA_HOME/bin

    After finishing all this, when I run the java -version command I get the following output:

        shekhar@ubuntu:~$ java -version
        java version "1.6.0_24"
        OpenJDK Runtime Environment (IcedTea6 1.11.4) (6b24-1.11.4-1ubuntu0.12.04.1)
        OpenJDK Server VM (build 20.0-b12, mixed mode)

    and when I run the ls -lah command I get the following output:

        shekhar@ubuntu:~$ ls -lah /usr/bin/java
        lrwxrwxrwx 1 root root 22 Sep 29 09:58 /usr/bin/java -> /etc/alternatives/java
        shekhar@ubuntu:~$ ls -lah /etc/alternatives/java
        lrwxrwxrwx 1 root root 45 Sep 29 09:58 /etc/alternatives/java -> /usr/lib/jvm/java-6-openjdk-i386/jre/bin/java

    Can anyone please tell me what I am missing? Why is Ubuntu still pointing to OpenJDK and not to my JDK 7?

    PS: I have seen a similar question and its answers, but that question is about Windows and not Ubuntu, so I am reposting the question for Ubuntu.
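    The symlink chain in the question shows why: /usr/bin/java goes through the alternatives system, and /usr/bin comes before the appended $JAVA_HOME/bin in PATH, so JAVA_HOME never wins. A sketch of registering the manually unpacked JDK with the alternatives system (the priority 100 is arbitrary):

        sudo update-alternatives --install /usr/bin/java java /usr/lib/jvm/jdk1.7.0_01/bin/java 100
        sudo update-alternatives --config java    # then select the jdk1.7.0_01 entry

    javac and the other JDK tools each have their own alternatives entry and can be registered the same way.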

  • EFS Remote Encryption

    - by Apoulet
    We have been trying to set up EFS across our domain. Unfortunately, reading/writing files over a network share does not work; we get an "Access Denied" error. Another worrying fact is that I managed to get it working for one machine, but no other would work. The machines are all Windows 2008 R2, running as VMs under an ESXi host. According to http://technet.microsoft.com/en-us/library/bb457116.aspx#EHAA we set up the machines involved to be trusted for delegation. The users are not restricted and can be trusted for delegation. The users have logged in on both sides and can read/write encrypted files locally without issues. I enabled Kerberos logging in the registry, and these are the relevant logs I get on the machine that holds the encrypted files, once for every certificate the user possesses (only the key name changes):

    Event ID 5058: Audit Success, "Other System Events"

        Key file operation.
        Subject:
            Security ID:    {MyDOMAIN}\{MyID}
            Account Name:   {MyID}
            Account Domain: {MyDOMAIN}
            Logon ID:       0xbXXXXXXX
        Cryptographic Parameters:
            Provider Name:  Microsoft Software Key Storage Provider
            Algorithm Name: Not Available.
            Key Name:       {CE885431-9B4F-47C2-8415-2D766B999999}
            Key Type:       User key.
        Key File Operation Information:
            File Path: C:\Users\{MyID}\AppData\Roaming\Microsoft\Crypto\RSA\S-1-5-21-4585646465656-260371901-2912106767-1207\66099999999991e891f187e791277da03d_dfe9ecd8-31c4-4b0f-9b57-6fd3cab90760
            Operation: Read persisted key from file.
            Return Code: 0x0

    Event ID 5061: Audit Failure, "System Integrity"

        Cryptographic operation.
        Subject:
            Security ID:    {MyDOMAIN}\{MyID}
            Account Name:   {MyID}
            Account Domain: {MyDOMAIN}
            Logon ID:       0xbXXXXXXX
        Cryptographic Parameters:
            Provider Name:  Microsoft Software Key Storage Provider
            Algorithm Name: RSA
            Key Name:       {CE885431-9B4F-47C2-8415-2D766B999999}
            Key Type:       User key.
        Cryptographic Operation:
            Operation:   Open Key.
            Return Code: 0x8009000b

    Could this be related to this error from the CryptAcquireContext function?

        NTE_BAD_KEY_STATE 0x8009000BL
        The user password has changed since the private keys were encrypted.

    The problem is that the users I am using at the moment cannot change their password.

  • How do I set up a shared directory on Linux?

    - by JR Lawhorne
    I have a Linux server I want to use to share files between users in my company. Users will access the machine with sftp or secure shell. Here is what I have:

        cd /home
        ls -l
        drwxrwsr-x 5 userA staff 4096 Jul 22 15:00 shared
        (other listings omitted)

    I want all users in the staff group to be able to create, modify, and delete any file and/or directory in the shared folder, and I don't want anyone else to have access to the folder at all. I have:
    - Added the users to the staff group by modifying /etc/group and running grpconv to update /etc/gshadow
    - Run chown -R userA.staff /home/shared
    - Run chmod -R 2775 /home/shared

    Now users in the staff group can create new files, but they aren't allowed to open the existing files in the directory for editing. I suspect this is due to the primary group ID associated with each user, which is still set to the per-user group created along with the user; so the PGID of user 'userA' is 'userA'. I'd rather not change the users' primary group to 'staff' if I can help it, but if it is the easiest option, I would consider it. And, as a variation on the theme, I'd like to do the same thing with another directory, but also allow the apache user to read files in that directory and serve them. What's the best way to set this up?
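    One sketch that avoids touching anyone's primary group, assuming the filesystem has POSIX ACL support enabled (standard on ext3/ext4, though the mount may need the acl option) and that the web server runs as a user named apache (adjust if it is www-data):

        # give staff rwx on everything that exists now...
        setfacl -R -m g:staff:rwx /home/shared
        # ...and a default ACL so files created later inherit the same access
        setfacl -R -d -m g:staff:rwx /home/shared

        # variation for the web-served tree: staff writes, apache reads
        setfacl -R -m g:staff:rwx,u:apache:rX /home/shared-web
        setfacl -R -d -m g:staff:rwx,u:apache:rX /home/shared-web

    The existing-files problem in the question comes from each user's umask (022 leaves new files group read-only); the default ACL sidesteps that for files created after it is in place.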

  • 2 servers on 2 networks in same office

    - by irot
    Hello gents, my office doesn't have a "server guy" in its employ, so I'm stuck with having to fix server issues for now. There are two servers in our office; both are file/web servers only accessible via the LAN. They are currently on the same network, so no issue there. The problem is, we recently got a static IP to use, but it's with a different ISP, so now we have two routers in our office. I would like to open one of the servers to the public as a web/FTP server. But if I hook a server up to the new router, users will no longer be able to access the files shared on that server (because they're on different networks). How can I make one server accessible to the public using the static IP line, but still able to share its files with the users connected to the other network? The server I want to make public is running Windows Server 2008, the other server Windows Server 2003. And as far as I know, IP addresses are assigned by the router. I'm just a developer and don't know much about networking. Thank you in advance.

  • Squid: The request or reply is too large

    - by Ueli
    I have set up a reverse proxy with an Apache in the background (on the same server). Everything works great, but I can't open one page; I get the error "The request or reply is too large." My cache.log contains:

        2010/12/09 15:28:29| WARNING: http.c:971: HTTP header too large
        2010/12/09 15:29:03| ctx: enter level 0: 'http://server/admin/cms/nav'
        2010/12/09 15:29:03| httpProcessReplyHeader: Too large reply header
        2010/12/09 15:29:03| ctx: exit level 0

    In my squid.conf I disabled the limits on request and reply sizes, without success:

        reply_body_max_size 0 allow all
        request_body_max_size 0

    Does someone know why that doesn't work? Thank you very much. Squid version:

        Squid Cache: Version 2.7.STABLE3
        configure options: '--prefix=/usr' '--exec_prefix=/usr' '--bindir=/usr/sbin' '--sbindir=/usr/sbin' '--libexecdir=/usr/lib/squid' '--sysconfdir=/etc/squid' '--localstatedir=/var/spool/squid' '--datadir=/usr/share/squid' '--enable-async-io' '--with-pthreads' '--enable-storeio=ufs,aufs,coss,diskd,null' '--enable-linux-netfilter' '--enable-arp-acl' '--enable-epoll' '--enable-removal-policies=lru,heap' '--enable-snmp' '--enable-delay-pools' '--enable-htcp' '--enable-cache-digests' '--enable-underscores' '--enable-referer-log' '--enable-useragent-log' '--enable-auth=basic,digest,ntlm,negotiate' '--enable-negotiate-auth-helpers=squid_kerb_auth' '--enable-carp' '--enable-follow-x-forwarded-for' '--with-large-files' '--with-maxfd=65536' 'amd64-debian-linux' 'build_alias=amd64-debian-linux' 'host_alias=amd64-debian-linux' 'target_alias=amd64-debian-linux' 'CFLAGS=-Wall -g -O2' 'LDFLAGS=' 'CPPFLAGS='
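    A hedged observation: the log complains about a header, while both directives above lift only the body limits. Squid 2.x keeps separate header limits, so raising those would look like the sketch below; the 64 KB values are arbitrary, and the directive names should be verified against this build's squid.conf documentation:

        # squid.conf -- the warning is "HTTP header too large", not body
        request_header_max_size 64 KB
        reply_header_max_size 64 KB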

  • MySQL partition "full"?

    - by gdea73
    I have a server that runs Debian 6.2 with Apache, PHP5, and MySQL. Well, I hadn't done anything with MySQL at all so far, just Apache and PHP; I must have installed it (mysql-server) at some point along the line, and I decided to log in to the database for the first time a couple of days ago, as I was considering using it for a future website project. I noticed that the "root" user had a password, and I didn't recall having set one; my usual root password was incorrect. So I attempted to reset the password:

        sudo service mysql stop        (stopped successfully)
        sudo /usr/bin/mysqld_safe --skip-grant-tables --skip-networking &

    The second command started successfully, from what I can tell. However, mysql itself returns "Can't connect to local MySQL server through socket '/var/run/mysqld/mysqld.sock' (2)", and additionally sudo service mysql start returns:

        /etc/init.d/mysql: ERROR: The partition with /var/lib/mysql is too full! ... failed!

    df -h tells me that / is 26% used (a 20 GB partition) and /home, roughly 900 GB, has only 5% usage. On a potentially related note, I've been experiencing random hangs since I noticed this problem: my tty2 randomly froze several times while idle, the entire system is suddenly unstable, and gnome-terminal also did not open. (Gnome-terminal apparently works now, disregard that part, but the server is still somewhat unstable; I randomly lost my connection when I was SSHed into it from my laptop, twice now.)
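    A diagnostic sketch for the mismatch between the init script's "too full" complaint and df showing 26% used: the same symptom appears when the filesystem is out of inodes rather than blocks, so checking both costs nothing:

        df -h /var/lib/mysql    # block usage of the filesystem holding the datadir
        df -i /var/lib/mysql    # inode usage; IUse% at 100% also reads as "full"
        du -sh /var/lib/mysql   # size of the datadir itself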

  • "Network Error - 53" while trying to mount NFS share in Windows Server 2008 client

    - by Mike B
    CentOS | Windows 2008. I've got a CentOS 5.5 server running nfsd. On the Windows side, I'm running Windows Server 2008 R2 Enterprise. I have the "File Services" server role enabled, and both Client for NFS and Server for NFS are on. I'm able to successfully connect to and mount the CentOS NFS share from other Linux systems, but am experiencing errors connecting to it from Windows. When I try to connect, I get the following:

        C:\Users\fooadmin>mount -o anon 10.10.10.10:/share/ z:
        Network Error - 53
        Type 'NET HELPMSG 53' for more information.

    (The IP and share name have been changed to protect the innocent. :-) ) Additional information:
    - I've verified low-level network connectivity between the Windows client and the NFS server with telnet (to NFS on TCP/2049), so I know the port is open. I've further confirmed that inbound and outbound firewall ports are present and enabled.
    - I came across a Microsoft tech note that suggested changing the "Provider Order" so "NFS Network" is above other items like Microsoft Windows Network. I changed this and restarted the NFS client; no luck.
    - I've confirmed that the share folder on the NFS server is readable/writable by all (777).
    - I've tried other variations of the mount command, such as mount 10.10.10.10:/share/ z: and mount 10.10.10.10:/share z: and mount -o anon mtype=hard \\10.10.10.10:/share *. No luck.
    - As per the command output, I tried typing NET HELPMSG 53, but that doesn't tell me much. Just "The network path was not found".

    I'm lost on how to proceed with troubleshooting. Any ideas?
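    Two client-side checks that ship with Windows' Services for NFS may narrow it down; note that the telnet test to TCP/2049 does not exercise the portmapper (port 111) or the separately assigned mountd port, either of which can block the mount on its own (IP sanitized as in the question):

        rem Lists the exports the server is offering; a hang or error here points
        rem at portmapper/mountd rather than nfsd itself.
        C:\> showmount -e 10.10.10.10
        rem Shows which RPC services (portmapper, mountd, nfs) are registered.
        C:\> rpcinfo -p 10.10.10.10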

  • FreeBSD rc.d script doesn't work when starting up

    - by kastermester
    I am trying to write an rc.d script to start fastcgi-mono-server4 on FreeBSD when the computer starts up, in order to run it with nginx. The script works when I execute it while logged in on the server, but when booting I get the following message:

        eval: -applications=192.168.50.133:/:/usr/local/www/nginx: not found

    The script looks as follows:

        #!/bin/sh

        # PROVIDE: monofcgid
        # REQUIRE: LOGIN nginx
        # KEYWORD: shutdown

        . /etc/rc.subr

        name="monofcgid"
        rcvar="monofcgid_enable"
        stop_cmd="${name}_stop"
        start_cmd="${name}_start"
        start_precmd="${name}_prestart"
        start_postcmd="${name}_poststart"
        stop_postcmd="${name}_poststop"

        # The original line here was: command=$(which fastcgi-mono-server4)
        # At boot time rc scripts run with a minimal PATH that does not include
        # /usr/local/bin, so which returns nothing, ${command} is empty, and the
        # shell tries to execute "-applications=..." itself, which is exactly the
        # boot-time error above. Hardcoding the full path (assuming the port
        # installed it under /usr/local/bin) avoids this:
        command="/usr/local/bin/fastcgi-mono-server4"
        apps="192.168.50.133:/:/usr/local/www/nginx"
        pidfile="/var/run/${name}.pid"

        monofcgid_prestart() {
            if [ -f $pidfile ]; then
                echo "monofcgid is already running."
                exit 0
            fi
        }

        monofcgid_start() {
            echo "Starting monofcgid."
            ${command} -applications=${apps} -socket=tcp:127.0.0.1:9000 &
        }

        monofcgid_poststart() {
            MONOSERVER_PID=$(ps ax | grep mono/4.0/fastcgi-m | grep -v grep | awk '{print $1}')
            if [ -f $pidfile ]; then
                rm $pidfile
            fi
            if [ -n $MONOSERVER_PID ]; then
                echo $MONOSERVER_PID > $pidfile
            fi
        }

        monofcgid_stop() {
            if [ -f $pidfile ]; then
                echo "Stopping monofcgid."
                kill $(cat $pidfile)
                echo "Stopped monofcgid."
            else
                echo "monofcgid is not running."
                exit 0
            fi
        }

        monofcgid_poststop() {
            rm $pidfile
        }

        load_rc_config $name
        run_rc_command "$1"

    In case it is not already super clear: I am fairly new to both FreeBSD and sh scripts, so I'm kind of prepared for some obvious little detail I overlooked. I would very much like to know exactly why this was failing and how to solve it, but if anyone has a better way of accomplishing this, I am all open to ideas.
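    For completeness, a hedged sketch of the remaining wiring: an rc.d script gated on rcvar="monofcgid_enable" does nothing at boot until it is enabled in /etc/rc.conf, and on FreeBSD the service command runs the script with the same stripped-down environment rc uses at boot, which makes it a better test than invoking the script by hand:

        # /etc/rc.conf
        monofcgid_enable="YES"

        # run with the boot-like environment (no login PATH), without rebooting:
        service monofcgid start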

  • Anyone know a good mind mapper that works with a scheduler?

    - by GLycan
    TL;DR: Mind-mapping tasks to be processed into a schedule based on task metadata.

    I have all sorts of ideas about what to invest resources (mainly time) in, but when I actually have time to do something I usually end up browsing reddit for not knowing what to do, and the frequency with which I forget deadlines scares me. I'd love to bring order and structure into my mind and always know what to do next.

    So I want a mind-mapping app where I'd give each branch (types and subtypes of things I want to do) an importance score (if there were two branches, and one had 60 while the other had 40, they would respectively get 60% and 40% of the parent's importance, with the root being 100) and an interval at which that branch should be revised/updated (a hobby I want to try out might be checked, say, once a week, while a school subject should be checked once a day). Each leaf (something I want/need to do) would get how much time it takes, a deadline (if any), and optionally an absolute importance, a recurrence (guitar practice might repeat once a week), prerequisites (reading something requires that book (although that could be brought somewhere), coding requires a box, jogging requires being outside), and maybe some other flags, like whether it's enjoyable or not.

    It should either be packaged with, or work with, a scheduler app, to which I'd say: look, my day works this way (completely busy from 8 to 9:15, then 15 minutes of being inside with nothing, ..., two hours with a box and the possibility to go outside, etc.), declaring that such-and-such pattern is school and happens every weekday except such-and-such days. The output should be a schedule, fit for printing or (when I finally get an Android) mobile viewing, that schedules tasks with regard to the availability of resources and to importance (importance being derived from the leaf task's parent branches) and the set of flags (all work and no play makes me a dull boy). One of these tasks should be reviewing anything that should be updated on that day, including future day layouts (e.g., if the time slots of future days have changed; this should be done every day).

    Does anyone know a collection of preferably open-source (or free, or pirateable) tools, or better yet a single one, that accomplishes this? I know Python pretty well and should be able to write any necessary glue.

  • Can I run Excel 2010 on a server?

    - by Glen Little
    This question is not about a person using Excel on a computer that happens to have a Windows Server OS, and it is not about using any SharePoint Services features! The question is about automated processes that use code (Office Automation) to open Excel files, manipulate them, run calculations, read data, save copies of the files, and close them... all in code. In previous versions of Excel, the licensing agreement prevented use on a public server, notes from Microsoft warned about the problems of trying to use Office Automation in a server environment, and we were warned that Excel was single-threaded and not designed for use on a server. Most of the articles about this were written before Office 2010. But now, Excel 2010 is designed to work on a high-performance computing server using HPC Services for Excel. One HPC document mentions that "Windows HPC Server 2008 R2 includes a comprehensive pop-up manager that can handle occasional dialog boxes and pop-up messages". So my question is... is it now "safe" to run code that automates Excel 2010 on a "normal" server without using the HPC services? If not, can HPC Services for Excel work on a single server? I don't need the high-performance, distributed-computing aspect of HPC Services for Excel... just the ability to run Excel on a server. Can that now be done? Thanks, Glen
