Search Results

Search found 21091 results on 844 pages for 'dsl client'.

  • How to connect to my US network overseas via VPN?

    - by GiH
    I purchased an Apple TV for my parents, and I have a Netflix account. My parents live overseas, and I was wondering if they could use my account to get it to work. I read that it won't work unless you use proxies or a VPN, so I was wondering if it's possible for me to set up a VPN to my own network in the US instead of paying for a service like StrongVPN. Setup: the router in the US is an Airport Extreme; the router abroad is a D-Link (not sure of the model). I know that the Apple TV doesn't have a built-in VPN client (maybe there will be an app once it's jailbroken), so as of now I'll have to use the routers, right? Any other ideas are welcome as well!

  • Install/import SSL certificate on Windows Server 2003/IIS 6.0

    - by ChristianSparre
    A couple of months ago we ordered an SSL certificate for a client's server, using the certificate request wizard in IIS 6.0. This worked fine, and the wizard was completed when we received the certificate. But about 2 weeks ago the server crashed and had to be restored, and now I can't get the site running. I have the .cer file, but what is the correct procedure to import the certificate? I hope some of you can help me. -- Christian
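
    For what it's worth, a hedged sketch of the usual recovery path, assuming the restored server still holds the original private key (if the restore wiped the key, the CA will have to re-key and reissue the certificate). The serial number below is a placeholder for the one shown in the certificate's details:

      rem import the certificate into the Local Machine "Personal" store
      certutil -addstore my mysite.cer
      rem re-associate the imported certificate with its private key
      certutil -repairstore my "00a1b2c3d4e5"

    After that, the certificate should be selectable again from the web site's Directory Security tab in IIS 6.0.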

  • Behaviour of nginx as proxy

    - by HD
    I'm testing nginx with different configurations to replace an architecture based on squid + apache. I know that I can use nginx to serve static requests and do load balancing, but there is one particular setup I don't understand clearly: I'm using 2 (load-balanced) nginx servers with the proxy_pass setting to pass all requests to an Apache server. When a client makes a request to the site, one of the nginx servers processes it and sends it on to Apache. Now, how is this behaviour an improvement to my system? It seems that all requests still pass through Apache, and I don't see any benefit. What happens when 100 simultaneous connections pass through nginx? Do all 100 connections go through to the Apache server, or is there some internal behaviour that reduces the impact on Apache?
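
    For context, a hedged sketch of why this setup usually helps even though every dynamic request still reaches Apache: nginx buffers Apache's response and spoon-feeds slow clients itself, so the heavyweight Apache worker is freed almost immediately, and static files never have to touch Apache at all. A minimal illustration (the backend address and paths are placeholders, not from the question):

      upstream apache_backend {
          server 10.0.0.10:8080;        # the real Apache server
      }
      server {
          listen 80;
          # static content served directly by nginx, no Apache worker used
          location /static/ {
              root /var/www;
          }
          location / {
              proxy_pass http://apache_backend;
              proxy_set_header Host $host;
              proxy_set_header X-Real-IP $remote_addr;
              proxy_buffering on;       # hold the response, free the Apache worker
          }
      }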

  • Windows Proxy Server advice

    - by Scott
    I have a webserver that currently has about 10 IP addresses. I have various clients that require a proxy server to route their internal traffic through. The load is not that great, so I'd like to have this ONE server act as a proxy for 10 different clients, with each client having their own unique IP on the server. The hardware is already set up, but I'm wondering what software solutions you would recommend. I've looked at WinGate, Squid-Proxy, etc., but am pretty green with this. Maybe there's even a way to have Windows do this natively? I'm running Windows Server 2008, 32-bit.
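
    As a hedged sketch only: in Squid (which the question already mentions, and which has Windows builds) the per-client source-IP mapping is typically one acl per client plus a tcp_outgoing_address rule. All addresses below are placeholders:

      http_port 3128
      acl client_a src 198.51.100.10            # client A's address
      acl client_b src 198.51.100.20            # client B's address
      tcp_outgoing_address 192.0.2.1 client_a   # client A exits via server IP 1
      tcp_outgoing_address 192.0.2.2 client_b   # client B exits via server IP 2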

  • Dropbox style folder for network drives

    - by Toby Allen
    Does anyone know of a third-party (or even Windows-native) solution to this simple problem? I want to map an internal network share on our Windows server to a folder on each of the client machines in the network. I don't want to have to use drive letters; I would just like to set up a folder on my C drive that is actually a Windows share, e.g. C:\DATA\Network Docs actually pointing to \\Server\SharedData\. Is this possible? Is there a third-party solution that does it? All clients are Windows XP and Windows 7.
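
    On the Windows 7 clients, a hedged sketch with no third-party software: an NTFS directory symbolic link can point straight at a UNC path (the paths below are the ones from the question). Windows XP has no mklink, and its junction points cannot target UNC paths, so the XP machines would still need another approach:

      rem run in an elevated command prompt on each Windows 7 client
      mklink /D "C:\DATA\Network Docs" \\Server\SharedData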

  • Avoid cache overflow in Atempo LiveBackup

    - by Vebjorn Ljosa
    When attempting the initial backup of a new client, Atempo LiveBackup seems to require a very large cache. For instance, a 20 GB cache is not enough to back up a computer that has 100 GB of data. It appears that LiveBackup adds new files to the cache at a faster rate than it can send them to the server, and when the cache fills up, the backup fails. Aside from removing most of the data from the computer and then adding it back gradually after the initial backup, is there a good workaround? Is it possible to make LiveBackup slow down its scan so as not to fill the cache? Or is it possible to place the cache on an external drive?

  • Dealing with Fine-Grained Cache Entries in Coherence

    - by jpurdy
    On occasion we have seen significant memory overhead when using very small cache entries. Consider the case where there is a small key (say a synthetic key stored in a long) and a small value (perhaps a number or short string). With most backing maps, each cache entry will require an instance of Map.Entry, and in the case of a LocalCache backing map (used for expiry and eviction), there is additional metadata stored (such as last access time). Given the size of this data (usually a few dozen bytes) and the granularity of Java memory allocation (often a minimum of 32 bytes per object, depending on the specific JVM implementation), it is easily possible to end up with the case where the cache entry appears to be a couple dozen bytes but ends up occupying several hundred bytes of actual heap, resulting in anywhere from a 5x to 10x increase in stated memory requirements. In most cases, this increase applies to only a few small NamedCaches, and is inconsequential -- but in some cases it might apply to one or more very large NamedCaches, in which case it may dominate memory sizing calculations.

    Ultimately, the requirement is to avoid the per-entry overhead, which can be done either at the application level by grouping multiple logical entries into single cache entries, or at the backing map level, again by combining multiple entries into a smaller number of larger heap objects.

    At the application level, it may be possible to combine objects based on parent-child or sibling relationships (basically the same requirements that would apply to using partition affinity). If there is no natural relationship, it may still be possible to combine objects, effectively using a Coherence NamedCache as a "map of maps". This forces the application to first find a collection of objects (by performing a partial hash) and then to look within that collection for the desired object. This is most naturally implemented as a collection of entry processors to avoid pulling unnecessary data back to the client (and also to encapsulate that logic within a service layer).

    At the backing map level, the NIO storage option keeps keys on heap, and so has limited benefit for this situation. The Elastic Data features of Coherence naturally combine entries into larger heap objects, with the caveat that only data -- and not indexes -- can be stored in Elastic Data.
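
    To make the "map of maps" idea concrete, here is a minimal, hypothetical sketch (not from the original post): logical keys are grouped by a partial hash into one physical cache entry, and an entry processor does the inner lookup on the storage node so the whole group is never shipped back to the client. GROUP_SIZE and the key/value types are illustrative assumptions:

      import java.util.Map;
      import com.tangosol.net.NamedCache;
      import com.tangosol.util.InvocableMap;
      import com.tangosol.util.processor.AbstractProcessor;

      public class GroupGetProcessor extends AbstractProcessor {
          private static final long GROUP_SIZE = 64;  // logical entries per physical entry
          private final long logicalKey;

          public GroupGetProcessor(long logicalKey) {
              this.logicalKey = logicalKey;
          }

          @Override
          public Object process(InvocableMap.Entry entry) {
              // the physical cache value is a small map holding many logical entries
              Map<Long, String> group = (Map<Long, String>) entry.getValue();
              return group == null ? null : group.get(logicalKey);
          }

          // usage: derive the group key by partial hash, then look inside the group
          public static Object get(NamedCache cache, long logicalKey) {
              Long groupKey = logicalKey / GROUP_SIZE;  // the "partial hash"
              return cache.invoke(groupKey, new GroupGetProcessor(logicalKey));
          }
      }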

  • Solutions for exporting a remote desktop app (display and audio)

    - by Richard
    I'm looking for a solution that will allow me to export a desktop app running on a server to a client machine. The server is ideally Linux; the desktop is Windows (plus Mac for icing on the cake). The export should be encrypted, and I need to support multiple clients from one server. I only want to export an individual app, not a whole desktop, and I am ideally looking for open-source solutions. The obvious, cheapest, simplest choice is X tunnelled over ssh (e.g. using Xming on the desktop), but X doesn't support audio. What are the alternatives? Or is there a way to support audio using X, or in parallel to X? Thanks
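
    One hedged sketch of the "audio in parallel to X" idea, assuming PulseAudio is available on both ends (there are Windows builds of the PulseAudio server): run a network-enabled PulseAudio on the desktop and point the remote app at it. The hostnames, subnet, and app name are placeholders, and this is untested here:

      # on the desktop: start Xming (or similar) and let PulseAudio accept TCP
      pactl load-module module-native-protocol-tcp auth-ip-acl=192.0.2.0/24

      # from the desktop: forward X over ssh and point the app's audio back home
      ssh -X user@appserver 'PULSE_SERVER=tcp:desktop.example.com:4713 some-app'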

  • Cannot redirect ip traffic with iptables to new ip on linux centOS

    - by Kiwi
    Today I was able to migrate some of the game servers to another server, and I need some help redirecting traffic from the old IP to the new one.

      SERVER1 1.1.1.1 ----- (internet) ----- SERVER2 2.2.2.2

    I assume iptables can do this, so I applied these rules on the CentOS box on SERVER1:

      # /etc/sysctl.conf
      net.ipv4.ip_forward = 1

      iptables -t nat -A PREROUTING -p udp --dest 1.1.1.1 --dport 27015 -j DNAT --to-destination 2.2.2.2:27015
      iptables -t nat -A POSTROUTING -j MASQUERADE
      iptables -t nat -A POSTROUTING -d 2.2.2.2 -p udp --dport 27015 -j SNAT --to 1.1.1.1

    But clients cannot connect to the server through the old IP; the redirection never happens.
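
    For comparison, a hedged sketch of a variant that commonly works for this kind of UDP redirect. Note that editing /etc/sysctl.conf alone does not change the running kernel until sysctl -p (or -w) is issued, and that forwarded packets must also pass the FORWARD chain:

      # apply forwarding now, not just at next boot
      sysctl -w net.ipv4.ip_forward=1

      # rewrite the destination of game traffic hitting the old IP
      iptables -t nat -A PREROUTING -d 1.1.1.1 -p udp --dport 27015 \
               -j DNAT --to-destination 2.2.2.2:27015
      # rewrite the source so SERVER2's replies return through SERVER1
      iptables -t nat -A POSTROUTING -d 2.2.2.2 -p udp --dport 27015 \
               -j SNAT --to-source 1.1.1.1
      # and let the forwarded packets through the filter table
      iptables -A FORWARD -d 2.2.2.2 -p udp --dport 27015 -j ACCEPT
      iptables -A FORWARD -s 2.2.2.2 -p udp --sport 27015 -j ACCEPT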

  • FTP gives me a error when uploading and deleting files [on hold]

    - by AR Games
    Here's the error I get when trying to delete files:

      Command:  DELE index.html
      Response: 550 Delete operation failed.

    And here's the error I get when trying to upload files:

      Command:  OPTS UTF8 ON
      Response: 200 Always in UTF8 mode.
      Status:   Connected
      Status:   Starting upload of C:\wamp\www\.DS_Store
      Command:  CWD /var/www/html
      Response: 250 Directory successfully changed.
      Command:  TYPE A
      Response: 200 Switching to ASCII mode.
      Command:  PASV
      Response: 227 Entering Passive Mode (76,185,76,101,78,222).
      Command:  STOR .DS_Store
      Response: 553 Could not create file.
      Error:    Critical file transfer error
      Status:   Retrieving directory listing...
      Command:  TYPE I
      Response: 200 Switching to Binary mode.
      Command:  PASV
      Response: 227 Entering Passive Mode (76,185,76,101,23,94).
      Command:  LIST
      Response: 150 Here comes the directory listing.
      Response: 226 Directory send OK.
      Status:   Directory listing successful
      Response: 421 Timeout.
      Error:    Connection closed by server
      Status:   Disconnected from server

    I'm running Windows and using the FileZilla FTP client.
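
    Those 550/553 responses are vsftpd's standard refusals when the logged-in account has no write permission in the directory (or write_enable=NO is set). A hedged first check on the server; the username is a placeholder and the config path varies by distro:

      # who owns the target directory, and can the ftp login write to it?
      ls -ld /var/www/html
      grep write_enable /etc/vsftpd/vsftpd.conf   # path varies by distro
      # one possible fix, assuming the ftp account is 'webdev':
      chown -R webdev /var/www/html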

  • FTP server (vsftpd) with webgui

    - by manutenfruits
    I want to build a file server to let users upload and download mostly multimedia, but also common files. Right now I have an Arch installation with vsftpd, and I'm about to install miniDLNA for multimedia sharing. The only problem is that FTP doesn't seem to fit my needs, because it almost always forces users onto a client such as FileZilla to make the server friendly. I have been looking for a web frontend for vsftpd, but apart from management interfaces there's nothing. I need a frontend accessible from a browser through which users can navigate through the folders in an easier and more elegant way than the plain FTP listing that browsers render by default. It should let users upload files and, as an awesome extra, let them play the multimedia directly in the browser. For this, I am willing to dump FTP if needed; I've heard about HTTP file servers but don't know much about them. I could code everything myself, but there's got to be something out there already.

  • Is my website running on an iPhone?

    - by Stefano Borini
    Provocative question, but in any case, this is what it would appear... unless there's some other reason, of course. In my cystat WordPress log, I found the following entry:

      IP         Browser       OS      Date                   Method  Type  URL
      127.0.0.1  Safari 419.3  iPhone  July 30, 2009 7:39 pm  GET     BLOG  /blog/

    The IP address is supposed to be the IP of the visiting client, so this is clearly not possible. Why do I get 127.0.0.1 as the IP? A cystat bug? Some weird network trick I am not aware of? Or is my website really running on an iPhone, and the guy at the Apple Store is reading my blog?

  • Recovering an old website

    - by noah
    I have a client with an old website that somebody set up for him long ago. The guy who set it up is unreachable, so how do we go about trying to take it over? A WHOIS lookup got us some contact information, but I don't have great hopes for it (it hasn't been updated in quite some time). The nameservers are ns1.theplanet.com and ns2.theplanet.com, and we will try calling them, but I don't expect we'll get much from them. What are our options? Is there a way I can discover the registrar so we can try contacting them as well? EDIT: It would be sufficient if we could get control of the domain name or put in some sort of redirect to the new site. Either the hosting was prepaid for quite some time or someone else is still paying for it, so we don't care about that.
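
    On the registrar question: the whois record itself usually names the registrar, along with the expiry date (which also tells you whether anyone is still paying). A hedged sketch, with example.com standing in for the real domain and assuming a unix-style whois client:

      whois example.com | grep -i -E 'registrar|expir'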

  • Why is access to my database very slow?

    - by Fabien
    I have a mysql database that used to work perfectly fine, but now it is dead slow on startup. When I type

      $> mysql -u foo bar

    I get the following message for about 30 seconds before I get a prompt:

      Reading table information for completion of table and column names
      You can turn off this feature to get a quicker startup with -A

    Of course, I tried the flag and the client starts a lot faster:

      $> mysql -u foo bar -A

    But why do I have to wait so long on a regular startup? This is not a very big database, and the data does not seem to be corrupted (everything looks fine after startup). I have no other client connecting to the mysql server at the same time (only one process is shown by the command show full processlist), and I have already restarted the mysqld service. What's going on?
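
    For what it's worth: the pause is the client's auto-rehash feature reading every table and column name for tab completion, which is exactly what -A skips. If living without completion is acceptable, the -A behaviour can be made the default for this user; a minimal sketch:

      # ~/.my.cnf  -- makes 'mysql' skip the completion preload, same as -A
      [mysql]
      no-auto-rehash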

  • How can one automatically logon to multiple user accounts in Windows 2008 R2

    - by DJFriar
    We are running a Windows 2008 R2 terminal server. Currently, we have local admin accounts created, one for each client that runs our software (SiteA, SiteB, etc.). We need these user accounts to log on automatically if the server is rebooted. The accounts need a full user environment, as we will log in remotely at times via TeamViewer to check processes, make changes, etc. We are using the registry-hack method now, but that only allows one account to log on automatically. I've seen a program called LogonExpert, but I don't know anything about it, so I can't judge how trustworthy it is. Is there any other way to auto-logon to multiple accounts in our environment? Currently the users are local users, but we could make them domain users if that is required.
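
    For reference, a hedged sketch of the registry-hack method mentioned above, which also shows why it stops at one account: Winlogon has only a single set of auto-logon values, and the password sits in the registry in plain text. The username and password are placeholders:

      Windows Registry Editor Version 5.00

      [HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Windows NT\CurrentVersion\Winlogon]
      "AutoAdminLogon"="1"
      "DefaultUserName"="SiteA"
      "DefaultPassword"="placeholder-password"

    Anything beyond that single console session would likely mean scripted RDP logons back to the same machine, or a dedicated tool of the LogonExpert sort.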

  • LAMP setup - phpmyadmin says the mysqli extension is missing (but its listed in phpinfo)

    - by WebweaverD
    I regularly set up VirtualBox Ubuntu guests to run as local webservers. I have set these up several times and never had an issue. Recently I have been cloning them, but this time I wanted to do a fresh install in the hope of fixing some niggling problems that had propagated through my setups. However, something has changed:

    1) VB guest additions no longer allow me to copy and paste (I'll worry about that later).

    2) More importantly, phpMyAdmin no longer works as installed. Initially, going to localhost/phpmyadmin gave a message that the page could not be found. So I followed some instructions (sorry, I know it's vague; I can't find them now) which created a phpmyadmin directory in /var/www, but now I get an error saying the mysqli extension is missing. If I run phpinfo, mysql and mysqli are both listed.

    All I have done so far is install apache2 (working), install php5 (which I think used to come with apache), install mysql server (and client for good measure), and install phpmyadmin. I found a post with a similar question which suggested I should install php5-mysql (done) and edit php.ini to uncomment the line extension=mysqli.so; that line is not there, and adding it brought no joy. I have restarted apache and still no joy with phpMyAdmin. Any help is much appreciated, as this is driving me nuts. Why the change for the worse? I was just starting to like Linux! I'm running a Windows 7 machine and the guest OS is Ubuntu 12.04; I ran apt-get update before doing anything, so all packages should be the latest versions.
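
    One detail worth flagging, hedged: on Ubuntu the CLI and the Apache module read different php.ini files, so phpinfo output from the command line can list mysqli while the copy of PHP that phpMyAdmin actually runs under does not have it. A sketch of the usual check/repair sequence on 12.04:

      sudo apt-get install php5-mysql       # provides mysqli for PHP 5
      sudo service apache2 restart
      php -m | grep -i mysqli               # this shows the CLI view only
      # Apache reads /etc/php5/apache2/php.ini, the CLI reads /etc/php5/cli/php.ini;
      # the extension is normally enabled via /etc/php5/conf.d/mysqli.ini containing
      # the line "extension=mysqli.so" (note the '=', not a '-')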

  • Forking a GPL dual licensed software with business owned copyrights

    - by Eric
    After receiving threats from the copyright holder of a dual-licensed piece of software (GPL2 and commercial) that we must buy the commercial version for projects in production, I am thinking of making a fork. In the case of software dual-licensed under the GPL2 and a commercial license, with the copyrights owned by a business, is forking the GPL2 version an option? Also, is forking a good way to deal with such cases? Background information: The software is a web CMS released in 2 editions, a free open-source GPL2 edition and a commercial edition that includes technical support and extra functionality. The problem is that now, basing their argument on the "distribution" definition in the GPL2, the company holding the copyrights argues that delivering the software and some extensions to a client is considered a "distribution", and that such a "distribution" falls under the GPL2 obligation to release the custom-made extension code. The custom-made extensions are mainly designs, templates, and very specific functionality. Basically they give me 3 choices:

    1. Buy the commercial licensed edition for GPL-based projects in production.
    2. Delete all projects in production based on the GPL2 version.
    3. Release all the extensions as GPL2 code.

    The first 2 options are not realistic for finished projects. The third option could be fine, but as most of the extensions are very specific, cleaning up the code to make it usable by other users means a lot of work, and I am also not sure the clients will appreciate having their website designs and specific functionality released publicly. The copyright-holding company has even contacted some clients directly, giving them the "choice". I know that this is a very corporate interpretation of the GPL2, and that such an action is nothing close to legally sound, but as an independent developer I don't want to take the risk of getting involved in long and tiring legal procedures. PS. This question was first asked on Stack Overflow, where it fell outside the scope and was closed; after reading the present site's FAQ, discussing software licensing seems fine here.

  • Distributed Server Monitoring Solution

    - by MaterialEdge
    I belong to an independent IT firm that manages and maintains about 50 business clients' networks, ranging from small 5-system networks to 200+ systems. Because we are unable to directly monitor each server at these locations (distributed over a very large area) on a regular basis, I am looking for a method to monitor and alert us to any problems that arise, so that we can respond quickly with, hopefully, preventative measures. I'm not sure what solutions are available for this type of situation, but something that utilizes a central server at our business, with all client servers sending alerts or logs to it for daily monitoring, might work best. All these servers are running a Windows Server OS. In your opinion, what would be the best course of action to accomplish this?

  • How to control remote access to Sonicwall VPN beyond passwords?

    - by pghcpa
    I have a SonicWall TZ-210. I want an extremely easy way to limit external remote access to the VPN beyond just a username and password, but I do not wish to buy/deploy an OTP appliance, because that is overkill for my situation. I also do not want to use IPSec, because my remote users are roaming. I want the user to be in physical possession of something, whether that is a pre-configured client with an encrypted key or a certificate (.cer/.pfx) of some sort. SonicWall used to offer "Certificate Services" for authentication, but apparently discontinued that a long time ago. So, what is everyone using in its place? Short of the expensive "Fortune 500" solutions, how do I limit access to the VPN to only those users who have possession of a certificate file, some other file, or something else beyond passwords? Thanks.

  • Oh snap! My RPi was upgraded to 512MB! Woo-hoo!

    - by hinkmond
    I ordered a Raspberry Pi Model B (256MB) over 4 months ago on backorder. When it finally came, I saw it was upgraded to the new half-a-gig model! Woot! But all was not perfect. Gary C. told me the shipped configuration of the new RPi models didn't have the right firmware for 512MB, and I had to upgrade the start.elf in the /boot directory to recognize all of the 512MB RAM. I ran a "free" command, and sure enough saw only 240MB. Sadness. But Gary gave me a copy of his start.elf, which worked after some trial and error. For anyone ordering the new RPi Model B w/512MB, here are the steps to get you going with the full 512MB RAM:

      sudo apt-get update --fix-missing
      sudo apt-get upgrade --fix-missing
      # NOTE: This step takes at least a couple hours on a
      # fast network
      wget https://raw.github.com/raspberrypi/firmware/\
      164b0fe2b3b56081c7510df93bc1440aebe45f7e/boot/\
      arm496_start.elf
      sudo mv /boot/start.elf /boot/orig-start.elf
      sudo mv arm496_start.elf /boot/start.elf
      sudo reboot

      free
                   total       used       free     shared    buffers     cached
      Mem:        497768     210596     287172          0      16892     169624
      -/+ buffers/cache:      24080     473688
      Swap:       102396          0     102396

    So of course this means... (drumroll) there is now 498MB available for the Java Embedded heap!

      java -Xmx400m -version
      java version "1.7.0_06"
      Java(TM) SE Embedded Runtime Environment (build 1.7.0_06-b24, headless)
      Java HotSpot(TM) Embedded Client VM (build 23.2-b09, mixed mode)

    Yeah, baby! Hinkmond

  • Staying on a registered-only IRC channel

    - by rwallace
    Freenode, like other IRC servers, has the property that one's connection will drop at the slightest hiccup. Fortunately mIRC knows to automatically reconnect when this happens. The problem lies with some channels such as #ai, which cannot be joined unless one's nickname is registered. mIRC doesn't know how to send the password to NickServ, and even if it did, at the time it reconnects, the original connection is still present on the server as a ghost; it doesn't know to wait a few minutes for the original connection to be garbage collected; thus it is not able to stay on such channels. Is there a way to solve this problem either with mIRC or some other IRC client that runs on Windows?
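
    For what it's worth, mIRC can be scripted to do both halves of this: identify to NickServ and reclaim the ghosted nickname on reconnect. A rough, hypothetical sketch for the remotes editor (Alt+R); mynick and secret are placeholders, and GHOST was freenode's reclaim command at the time:

      on *:CONNECT:{
        if ($network == freenode) {
          ; kill the ghost of the dropped connection, then take the nick back
          msg NickServ GHOST mynick secret
          nick mynick
          msg NickServ IDENTIFY secret
        }
      }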

  • Dedicated server: managed hosting or manage it myself?

    - by ddawber
    We're currently hosting a number of sites on a self-managed dedicated server. Some companies, however, offer a managed dedicated server hosting service. They offer:

    - Roughly the same server spec
    - Ticketing-system support
    - Managed daily backups
    - A virtual firewall (but with a limit of 10 IP addresses allowed through at any one time)

    Now, this managed hosting comes at extra expense, somewhere in the region of $500 per month, and the limit on the number of IP addresses they'll manage on the firewall is also a real pain. My thinking is that it would be better and cheaper to:

    - Stay with the same host, since the dedicated box is fine
    - Get an Amazon AWS account and use their servers to manage backups; there are a number of good tools that can be used to automate the process
    - Configure iptables so that I have complete control of the firewall

    I want to know:

    1. Is a managed virtual firewall likely to be more secure than me configuring iptables?
    2. Is it best, in your opinion, to let someone else take care of backups?
    3. From your experience, is there anything else I'm missing that warrants using managed hosting over a DIY service?

    I think there is some reluctance to drop managed hosting, since a managed host in effect takes responsibility for your server, whereas any hardware or security issue with a server that we manage would mean we are forced to hold our hands up when a client site goes down. That said, I personally don't think a managed host does that much in the day-to-day running of your server (backups are automatic, OS updates are carried out with ease, etc.).

  • RDP locks up login, doesn't unlock on Windows

    - by private_meta
    From time to time, when I try to log in through or after a remote connection, my system locks up the login session. I can't log in anymore; the screen turns black (the monitor is still active, the image is black). In the most recent case, the system did not come back from the lock-up, and I had to reset the computer. Any idea what might be the issue here? More information: both computers are Windows 7. The RDP server has a wired connection; the client is either wireless or wired. The network card involved on the server is a "Realtek RTL8168C(P)/8111C(P) Family PCI-E Gigabit Ethernet NIC (NDIS 6.20)" built into an ASRock mainboard. I connect either over the local LAN or over the internet through a NAT/router.

  • No internet connection for some programmes after installing ad hoc wireless network

    - by Michael
    After installing a wireless network (through the program iPhoneModem), several programmes have stopped working when connected to the Internet using another wireless connection.

    Working programmes: Firefox (browser), uTorrent (p2p), FileZilla (ftp), etc.
    Programmes that are not working: Chrome (browser), Digsby (IM client), etc.

    I'm running Windows 7. I have tried disabling Windows Firewall entirely, as well as AVG antivirus, with no effect. I've also run the FixIt program from Microsoft addressing a corrupt TCP/IP stack; this too had no effect. Any suggestions?
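
    Purely a hedged guess, but one pattern that fits "Firefox works while Chrome does not" is a leftover system proxy: Firefox keeps its own proxy settings, whereas Chrome and many IM clients use the Windows system proxy, which a tethering tool such as iPhoneModem may have set. Two checks from an elevated command prompt:

      netsh winhttp show proxy    # is a system-wide proxy still configured?
      netsh winhttp reset proxy   # clear it; also check Internet Options > Connections > LAN settings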

  • Managing service passwords with Puppet

    - by Jeff Ferland
    I'm setting up my Bacula configuration in Puppet. One thing I want to do is ensure that each password field is different. My current thought is to hash the hostname with a secret value; that would give each file daemon a unique password, and that password can be written to both the director configuration and the file server. I definitely don't want to use one universal password, as that would permit anybody who compromises one machine to get access to any machine through Bacula. Is there another way to do this, other than using a hash function to generate the passwords? Clarification: this is NOT about user accounts for services. This is about the authentication tokens (to use another term) in the client/server files. Example snippet:

      Director {                 # define myself
        Name = <%= hostname %>-dir
        QueryFile = "/etc/bacula/scripts/query.sql"
        WorkingDirectory = "/var/lib/bacula"
        PidDirectory = "/var/run/bacula"
        Maximum Concurrent Jobs = 3
        Password = "<%= somePasswordFunction %>"   # Console password
        Messages = Daemon
      }
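
    A minimal, hypothetical sketch of the hash idea inside the ERB template itself; the secret string is illustrative and would live only on the puppetmaster, and depending on the Puppet version the fact is reachable as fqdn, @fqdn, or scope.lookupvar('fqdn'). A keyed digest of the host's name yields a stable, per-client token:

      <%
        require 'digest/sha1'
        secret = 'long-random-master-secret'        # kept only on the puppetmaster
        bacula_pw = Digest::SHA1.hexdigest(secret + fqdn)
      %>
      Director {
        Name = <%= hostname %>-dir
        Password = "<%= bacula_pw %>"               # same value rendered into the FD config
      }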
