Search Results

Search found 6882 results on 276 pages for 'ftp proxy'.

  • Serving a default image with nginx

    - by ustun
    I have the following configuration in nginx:

        location /static/ {
            root /srv/kose/;
            expires 2w;
            access_log off;
        }
        location / {
            proxy_pass http://127.0.0.1:8089;
        }

    If a file is not found in /static/, I want to serve a default image instead of falling through to the proxy_pass on port 8089. Currently nginx looks for the file under the static root and, if it cannot find it, tries the proxy. I have tried the following, but it doesn't work. How can I tell nginx to serve the default image? I have also tried try_files, to no avail.

        location /static/ {
            root /srv/kose/;
            expires 2w;
            access_log off;
            error_page 404 /srv/static/defaultimage.jpg;
        }
        location / {
            proxy_pass http://127.0.0.1:8089;
        }
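
    A minimal sketch of one likely fix: error_page takes a URI, not a filesystem path, and try_files can express the fallback directly. This assumes the image actually lives at /srv/kose/static/defaultimage.jpg, so the /static/ root resolves it:

        location /static/ {
            root /srv/kose/;
            expires 2w;
            access_log off;
            # serve the requested file if it exists, else the default image
            try_files $uri /static/defaultimage.jpg;
        }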

  • debian installation without internet connection

    - by Gobliins
    Hi, I want to install some Debian distributions (Grip, Crush, Lenny...) for arm/armel architectures (see www.emdebian.org). I am following this guide: www.aurel32.net/info/debian_arm_qemu.php. The problem is that my Linux VM / QEMU guest has no internet connection, because I am behind a proxy. Is there a way to download all the needed files in advance and save them to disk, so that no internet connection is needed during the installation? I am working under Windows at the moment. My regards
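
    One avenue worth trying before going fully offline, sketched here on the assumption that the proxy allows plain HTTP: point apt (and the guide's download steps, which honor the same environment variable) at the proxy inside the guest. The host, port and credentials below are placeholders.

        # environment variable honored by apt, wget, debootstrap, etc.
        export http_proxy=http://user:password@proxyhost:8080/

        # make the setting permanent for apt
        echo 'Acquire::http::Proxy "http://user:password@proxyhost:8080/";' \
            > /etc/apt/apt.conf.d/80proxy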

  • Gmail/Facebook/Hotmail not opening in Firefox/IE on Windows 7 Home

    - by singlepoint
    Hi, I am unable to open Gmail/Facebook/Hotmail in Firefox or IE on Windows 7 Home. I just unboxed a brand new HP laptop with Norton Security Suite running on it. I get the following error message in Firefox. Please help. "The connection has timed out. The server at www.google.com is taking too long to respond. The site could be temporarily unavailable or too busy; try again in a few moments. If you are unable to load any pages, check your computer's network connection. If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web."

  • Linux shell to restrict sftp users to their home directories?

    - by hfidgen
    I need to give SFTP access to a directory within my webroot on my server. I've set up ben_files as a user and set the account's home directory to /var/www/vhosts/mydomain.com/files. That works fine with plain old FTP - the user is restricted to just that directory - but to enable SFTP I had to give the account a /bin/bash shell, which suddenly opens up my entire server... Is there a way of giving this user SFTP access without opening up all my directories? I'd really like the account restricted to its home directory only ;)
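
    A minimal sketch for OpenSSH 4.9 or newer, using its built-in SFTP server and ChrootDirectory in /etc/ssh/sshd_config. One caveat that trips most people: the chroot target and every directory above it must be owned by root and not group- or world-writable, so in practice you chroot to a root-owned parent and give the user a writable subdirectory inside it.

        # replace the default "Subsystem sftp" line with the in-process server
        Subsystem sftp internal-sftp

        Match User ben_files
            ChrootDirectory /var/www/vhosts/mydomain.com/files
            ForceCommand internal-sftp
            AllowTcpForwarding no
            X11Forwarding no

    With this in place the account needs no real shell at all, so /bin/bash can be taken away again.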

  • Cannot access internal website after being connected to other network

    - by Dandroid
    I have a client who claims they cannot access their internal website after they have been out traveling, for example. They have to reset their browser settings every time to regain access. As they live on another continent and time zone, it's hard for me to run live tests with them to see what changes in their browser settings. My first guess would be some sort of proxy-related issue, but I want to know if there could be other reasons for this. It's not the LAN itself, that we are sure of; it's browser-specific only. Edit: It's worth mentioning that it only happens when they turn off/restart their computers.

  • ASP.NET, Visual Studio and Subversion - how to integrate?

    - by Michael Stum
    I use AnkhSVN with Visual Studio 2005 and 2008. One thing that bugs me is that Ankh does not really work with ASP.NET sites: I cannot add them properly to a repository, and it won't detect changes, especially because the site is on a remote server accessed through FrontPage Extensions (File - Open Site). What are the alternatives? Does a better plug-in exist? Manually downloading the files through FTP and using TortoiseSVN or svn.exe is not really the level of integration I want :) I want to stay within the Visual Studio IDE when possible. Also, I do not control the remote server, so I cannot install anything on it, which means all the change tracking and comparison against the repository has to happen on my machine.

  • How can I sqldump a huge database?

    - by meder
    SELECT COUNT(*) FROM table gives me 3296869 rows. The table only contains 4 columns, storing dropped domains. I tried to dump the SQL with:

        $backupFile = $dbname . date("Y-m-d-H-i-s") . '.gz';
        $command = "mysqldump --opt -h $dbhost -u $dbuser -p $dbpass $dbname | gzip > $backupFile";

    However, this just produces an empty 20 KB gzipped file. My client is using shared hosting, so the server specs and resource usage aren't top of the line. I'm not even given SSH access or direct access to the database, so I have to make queries through PHP scripts I upload via FTP (SFTP isn't an option, again). Is there some way I can download portions of it sequentially, or pass an argument to mysqldump that will optimize it? I came across http://jeremy.zawodny.com/blog/archives/000690.html, which mentions the -q flag, and tried that, but it didn't seem to do anything differently.
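
    One detail worth checking first, since it would explain the empty dump by itself: with a space after -p, mysqldump prompts for a password and treats $dbpass as the database name. The password must follow -p immediately. A corrected sketch (the exec() error check is an addition for illustration):

        $command = "mysqldump --opt -h $dbhost -u $dbuser -p$dbpass $dbname | gzip > $backupFile";
        exec($command, $output, $exitCode);   // non-zero $exitCode means the dump failed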

  • How to access website CMS with only access to database

    - by user1741615
    I have a website that uses an in-house CMS, and I don't know the login details. The platform itself doesn't have any "reset your password" functionality. I do have access to FTP and phpMyAdmin, and I found the SQL table containing the user details, but of course the password is stored as an MD5 hash. I tried manually creating a user in phpMyAdmin and filling in a password hashed with MD5 (I used an online MD5 service for that), but it still doesn't work. Does anybody know other tricks I can use? Thanks.
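
    A sketch of the usual next step, assuming the CMS stores a plain, unsalted MD5 of the password - the table and column names here are hypothetical and need to be matched to the real schema:

        -- run in phpMyAdmin's SQL tab; overwrite an existing account's hash
        -- rather than creating a new row, since the CMS may require other
        -- fields (status flags, role ids) that a hand-made row won't have
        UPDATE users SET password = MD5('newpassword') WHERE username = 'admin';

    If that still doesn't log in, the CMS most likely salts the hash (e.g. md5(salt . password)), in which case the salting scheme has to be dug out of the PHP source, which is available over FTP.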

  • Fully secured gateway web sites

    - by SeaShore
    Hello, are there any web sites that serve as gateways for fully encrypted communication? I mean sites with which I can open a secure session and then exchange both URLs and content with other sites through them, securely. Thanks in advance. Update: Sorry for not being clear. I was wondering if there is a way to access any site on the Internet (http or https) without letting an intranet proxy read the requested URL or the received content. My question is whether such a site exists, e.g.: I connect to that site via https, I send it a URL in a secured way, the site fetches the content from the target site (possibly in a non-secured way) and returns the requested content to me over the secure channel.

  • Nginx return 444 depending on upstream response code

    - by Mark
    I have nginx set up to pass requests to an upstream using proxy_pass. The upstream is written to return a 502 HTTP response on certain requests; rather than returning the 502 with all its headers, I would like nginx to recognize this and return 444 so nothing is sent back. Is this possible? I also tried to return 444 on any 50x error, but that doesn't work either:

        location / {
            return 444;
        }
        location ^~ /service/v1/ {
            proxy_pass http://127.0.0.1:3333;
            proxy_next_upstream error timeout http_502;
            error_page 500 502 503 504 /50x.html;
        }
        location = /50x.html {
            return 444;
        }
        error_page 404 /404.html;
        location = /404.html {
            return 444;
        }
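
    A sketch of what is probably missing: by default nginx relays the upstream's response straight to the client, and error_page only fires for proxied responses once proxy_intercept_errors is on. With that set, the 502 can be rerouted to a named location that drops the connection:

        location ^~ /service/v1/ {
            proxy_pass http://127.0.0.1:3333;
            # let nginx act on upstream responses >= 300 instead of relaying them
            proxy_intercept_errors on;
            error_page 500 502 503 504 = @drop;
        }
        location @drop {
            return 444;
        }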

  • Disable address bar in Internet Explorer 9

    - by token
    I'm trying to disable the address bar in IE9. I've done a significant amount of searching and just can't seem to find a way to make it happen. A lot of web resources discuss how to do it in IE8, but not IE9. Why, you might ask? I have an application hosted in a remote desktop farm that links out of the application to web pages in Internet Explorer. I need to ensure users are limited to just the pages the program pushes them to. I realize I could use a proxy server to limit where they can go, but I'm trying to find a really simple way to just disable the address bar instead. I can't use kiosk mode because it puts the browser into full-screen mode. That won't work for my situation, as I need to give users what appears to be a regular browsing experience, just without an address bar.

  • Malware on a client's website - Ideas?

    - by Jeriko
    We recently got a call from one of our clients complaining that their site has some "strange looking code" at the bottom of the page. We checked the source and discovered that about 800 bytes of malicious JavaScript had been appended to the templates/master file, after the </html> tag. I won't post said code because it looked particularly nasty. As far as I can tell, there would be no way for this file to be edited unless someone had direct access to the server and/or the FTP login details. The file itself was modified on disk, which rules out any kind of SQL injection attack. Besides a person obtaining credentials and hand-modifying this file, is there any other logical explanation for what happened? Has anyone else had experience with something like this?
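
    For what it's worth, a couple of first-response commands worth running on the server - a sketch, assuming shell access and a webroot under /var/www:

        # files changed in the last 7 days, the usual window for this kind of injection
        find /var/www -type f -mtime -7 -ls
        # common obfuscation patterns used by injected payloads
        grep -rl --include='*.php' --include='*.html' 'eval(base64_decode' /var/www

    Stolen FTP credentials (often harvested by malware on a developer's PC) are a common cause of this exact pattern, so rotating every password and checking the FTP logs for unfamiliar IPs is a reasonable next step.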

  • Template approach for a PHP application

    - by Industrial
    Hi everyone, we're in the middle of building a new e-commerce-related PHP application, and we have come to the point where we have started to think about how to solve templating for our customers' needs. What we would like to do is offer our customers the possibility of uploading and modifying templates to suit their company's profile. The initial thought is that we should not reinvent the wheel, and instead let our customers upload their templates over FTP, so basic HTML skills will be required. For customers who want a modified or customized template and don't have the knowledge, we offer that service as well. I know there are a number of issues to solve before this could be considered safe, like preventing XSS and writing scripts that check each uploaded file for potential security threats, and so on. Of course, some parts will probably be too complex for the customer to modify themselves, so this approach may not apply to all template files in the frontend application. But besides that, what would be a good way to handle this?
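
    A very rough sketch of the upload-time check described above, in PHP - a starting point rather than a complete defence, and the rejected patterns are illustrative assumptions:

        // reject templates containing executable PHP, inline script, or
        // event-handler attributes; templates are assumed to be plain HTML
        function template_looks_safe($contents)
        {
            $banned = array('/<\?/', '/<script\b/i', '/\bon\w+\s*=/i');
            foreach ($banned as $pattern) {
                if (preg_match($pattern, $contents)) {
                    return false;
                }
            }
            return true;
        }

    In practice a safer design is to never execute uploaded files at all: treat them as data, render them through a sandboxed template engine, and store them outside the web-served document root.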

  • Run command on init and restart on errors

    - by chersanya
    I have internet access on my PC through a proxy over SSH, so every time I need to execute ssh -L PORT:SERVER:PORT LOGIN@SERVER and then type a password. After each network failure or reconnect, this command has to be executed again. I've grown tired of this and am looking for a way to do it automatically: first, run the command after boot (that doesn't seem to be a problem - put it in some init file and that's all), and then rerun it (and, if possible, supply the password) on each network failure. Is this possible, and how? OS: Linux (Debian)
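
    A sketch of the usual answer, assuming the autossh package can be installed: switch from a password to an SSH key so no typing is needed, then let autossh watch the tunnel and restart it whenever it drops.

        # one-time setup: key-based login instead of a password
        ssh-keygen -t rsa
        ssh-copy-id LOGIN@SERVER

        # keep the tunnel alive; suitable for an init script or @reboot cron entry
        autossh -M 0 -f -N \
            -o "ServerAliveInterval 30" -o "ServerAliveCountMax 3" \
            -L PORT:SERVER:PORT LOGIN@SERVER

    -M 0 disables autossh's separate monitoring port and relies on the ServerAlive options to detect a dead connection; -N opens the tunnel without running a remote command.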

  • How can I access my desktop computer from my Android phone?

    - by Qurben
    Is it possible to access a computer that gets its internet connection through an Android phone (via tethering)? I want to use SSH to connect to the computer from a different computer in the same network, but I am not able to reach it. Is it possible to port-forward, use some kind of transparent proxy, or use a DMZ? My phone is rooted, I have CyanogenMod installed, and I can use iptables. EDIT: The changed title completely changed the question! My setup is the following: I have an Android phone connected to a computer through a USB cable, tethering internet from the phone, and I wanted to SSH into the computer behind the Android phone from another computer in the same network as the phone. This was not possible out of the box, because the phone creates a separate network for the connected computer, effectively shielding it from incoming connections. It turned out to be quite simple to fix by just using iptables.
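
    For anyone landing here with the same setup, a rough sketch of the iptables approach, run as root on the phone. The interface name and the 192.168.42.x subnet (Android's usual USB-tether range) are assumptions - check the real values with ip addr or ifconfig first:

        # let the kernel route between the phone's networks
        echo 1 > /proc/sys/net/ipv4/ip_forward
        # forward SSH arriving on the phone's wifi interface to the tethered PC
        iptables -t nat -A PREROUTING -i wlan0 -p tcp --dport 22 \
            -j DNAT --to-destination 192.168.42.100:22
        iptables -A FORWARD -d 192.168.42.100 -p tcp --dport 22 -j ACCEPT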

  • uploading files greater than 1MB = connection resets

    - by Legit
    I'm using nginx on the frontend as a proxy cache and Apache on the backend. I've set my PHP settings to the following:

        error_log = /var/www/site1/php_error.log
        error_reporting = 22527
        file_uploads = On
        log_errors = On
        max_execution_time = 0
        max_file_uploads = 20
        max_input_time = -1
        memory_limit = 512M
        post_max_size = 0
        upload_max_filesize = 1000M

    What's the problem? Uploading files smaller than 1 MB succeeds, but for anything larger Google Chrome outputs: "Error 101 (net::ERR_CONNECTION_RESET): The connection was reset." I already checked for the error log file, but it doesn't exist in the directory. I also checked /var/log/httpd/error_log, but there were no upload-related problems. I don't know what else might have caused this, so I'm reaching out for your helping hand. Thanks!
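
    The 1 MB threshold is a strong hint that PHP isn't the culprit: nginx's client_max_body_size defaults to exactly 1m, and requests over the limit are rejected before they ever reach Apache or PHP. A sketch of the fix, assuming the proxying happens in the usual server block:

        server {
            # ...
            # allow request bodies up to the PHP upload limit
            client_max_body_size 1000m;
        }

    While in there, it's also worth giving post_max_size in php.ini an explicit value larger than upload_max_filesize; 0 does not reliably mean "unlimited" for that directive across PHP versions.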

  • 1K incoming http post requests per second, each with a 10-50K file

    - by Blankman
    I'm trying to figure out what kind of server setup I will need to support 1K HTTP POST requests per second, where each POST contains an XML file between 5 and 50 KB (an average of 25 KB). Even if I get a 100 Mb/s connection with my dedicated box (they usually give 10 Mb/s, but you can upgrade), by my calculations that is about 12,000 KB/s, which means about 480 of the 25 KB files per second. So this means I need around 3 servers, each with a 100 Mb/s connection. Would a single server running HAProxy be able to redirect the requests to the other servers, or does this mean I need something that can handle more than 100 Mb/s to proxy things out to the other servers? If my math is off, I'd appreciate any corrections.
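
    A quick sanity check of the arithmetic, for reference: 1,000 requests/s x 25 KB = 25,000 KB/s = 25 MB/s, or about 200 Mb/s on the wire before HTTP and TCP overhead. A 100 Mb/s port moves roughly 12.5 MB/s, i.e. around 500 such files per second, so 2-3 such boxes plus headroom for overhead and traffic spikes is consistent with the numbers in the question. Note, though, that a proxy in front has to carry the full 200+ Mb/s itself, since every request passes through it.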

  • SQL Server 2005 to 2008 Bak file help please!

    - by Brandon
    I have a SQL Server 2005 database backup that I want to restore to SQL Server 2008 on my server. I spent 3 days transferring the .bak file from my own machine to the server, and when I tried to restore it I got an error. I then read about a completely different method for moving a SQL Server 2005 database to SQL Server 2008: the detach/attach method, where I detach the database in SQL Server 2005, transfer the MDF file via FTP to my server, and then attach it in SQL Server 2008. Well, I already used a lot of bandwidth transferring the .bak file to my server. Is there a way to convert the .bak file that is already on my server to an MDF file and attach it in SQL Server 2008?
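
    There is no separate conversion step: restoring the backup is what produces the MDF, and SQL Server 2008 can restore a 2005 .bak directly. A sketch of the restore, with hypothetical names and paths - the real logical file names come from the first command:

        -- list the logical file names stored inside the backup
        RESTORE FILELISTONLY FROM DISK = 'C:\backups\MyDb.bak';

        -- restore, relocating the files to paths that exist on the 2008 server
        RESTORE DATABASE MyDb FROM DISK = 'C:\backups\MyDb.bak'
        WITH MOVE 'MyDb_Data' TO 'C:\sqldata\MyDb.mdf',
             MOVE 'MyDb_Log'  TO 'C:\sqldata\MyDb.ldf';

    The original restore error was quite possibly a path problem of exactly this kind (the backup records the 2005 machine's file paths), in which case WITH MOVE is the whole fix.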

  • Any cloud storage service that lets us authenticate the file when we serve it to our visitors?

    - by TORr0t
    Let's say I want to restrict a file to my visitors. I mean, I have an xx.avi file to be streamed/downloaded, and the visitor paid me for the bandwidth and the size of the file. In Amazon S3, I can't control the file at all (there is a very basic control mechanism, which is not OK for me). The only way is for my server to proxy the file: it fetches the file from the S3 storage node and sends it to the visitor after a PHP script approves the authentication. But this way I would double the bandwidth usage, and there would also be a latency problem, since my server needs to get the file from Amazon S3 first. So I was wondering if there is a better solution, or any cloud storage service that lets us control file restrictions for our visitors. Thanks
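
    Worth a second look at S3 itself: it supports expiring, signed query-string URLs, which fit this use case without any proxying - the PHP script checks the payment, then hands the visitor a short-lived link pointing directly at S3. A sketch using the (legacy) signature version 2 scheme; the bucket name, object key and credential variables are placeholders:

        // build a pre-signed S3 URL valid for 5 minutes
        $bucket  = 'mybucket';
        $object  = 'xx.avi';
        $expires = time() + 300;
        $stringToSign = "GET\n\n\n{$expires}\n/{$bucket}/{$object}";
        $signature = urlencode(base64_encode(
            hash_hmac('sha1', $stringToSign, $awsSecretKey, true)));
        $url = "https://{$bucket}.s3.amazonaws.com/{$object}"
             . "?AWSAccessKeyId={$awsAccessKey}&Expires={$expires}&Signature={$signature}";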

  • Is it better to combine Apache for file manipulation and uploads with Nginx for static file serving, or to use one of the two alone?

    - by user1032393
    Based on my research, I've read that nginx is ideal for serving static files and images. My application depends heavily on uploading images, rewriting them, and then serving them. Given that I currently have only one VPS, it has been suggested that I use nginx to serve the images and the website, and reverse-proxy to Apache (on the same VPS) to rewrite files with ImageMagick and handle the file uploads. Which would be the best solution: Apache, Nginx, or Apache + Nginx? In terms of "best", I'm looking at minimal average RAM consumption while maintaining a decent load speed of maybe under 2 seconds.
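
    For reference, a minimal sketch of the suggested split on one VPS - nginx answering on port 80 and handing anything dynamic to Apache on a local port (8080 here is an assumption):

        server {
            listen 80;

            # static files and processed images straight from disk
            location /static/ {
                root /var/www;
                expires 30d;
            }

            # uploads and image processing go to Apache
            location / {
                proxy_pass http://127.0.0.1:8080;
                proxy_set_header Host $host;
                client_max_body_size 50m;   # keep in sync with PHP's upload limits
            }
        }

    Memory-wise this tends to beat Apache alone, since nginx absorbs the many cheap static requests and Apache's heavier processes run only for the few dynamic ones.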

  • Creating an OBEX connection without service discovery

    - by Vio
    Hello there. My problem is this: I want to create an OBEX connection, and I already know my server's address. The question is how to get the connection URL without service discovery. I use a normal Java project for my server side and a MIDlet project for my client side. I will post the code, maybe it helps.

    Server connection:

        UUID uuid = new UUID("8841", true);
        String url = "btgoep://localhost:" + uuid
                   + ";name=FTP;authenticate=false;master=false;encrypt=false";
        SessionNotifier sn = (SessionNotifier) Connector.open(url);
        System.out.println("Now waiting for a client to connect");
        sn.acceptAndOpen(this);
        System.out.println("A client is now connected");

    When I try to connect with the client, it never gets to the last line :(... Any ideas why? Hope you can help. Thanks.
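
    A hedged observation: acceptAndOpen() simply blocks until a client connects, so the server code above may well be fine - the question is whether the client ever reaches it. On the client side, a JSR-82 connection URL built without service discovery has to carry the server's Bluetooth address and the RFCOMM channel the service was assigned, e.g. (address and channel are made-up values):

        // javax.microedition.io.Connector, javax.obex.ClientSession
        ClientSession session =
            (ClientSession) Connector.open("btgoep://0123456789AB:5");

    The catch is that the channel number is chosen by the stack when the server registers and can change between runs - which is exactly the problem service discovery exists to solve - so hardcoding it only works if the server reliably gets the same channel.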

  • Block all third party domains from web pages

    - by wizlb
    When I'm browsing the web, I'd like not to be tracked by any third-party services like Facebook or Google. For instance, if I visit somepage.com, I don't want my browser requesting anything from facebook.com unless I allow it. However, if I visit facebook.com itself, Facebook should still work. Does anyone know of a Chrome or Firefox extension that will allow me to do this? AdBlock in Chrome doesn't seem to be enough, because it just hides the page elements; it doesn't stop the browser from downloading them. I imagine that some kind of proxy/browser-extension hybrid would be best. Any suggestions? Thank you.

  • Is there a way to use something like RewriteRule ... [PT] for an external URL?

    - by nbolton
    I have a non-Apache web server running on port 8000, but it cannot be reached from behind corporate firewalls, so I would like to use my Apache 2 server as a proxy to it. I've tried:

        RewriteEngine On
        RewriteRule /.* http://buildbot.synergy-foss.org:8000/builders/ [PT]

    ...but this does not work; I get:

        Bad Request
        Your browser sent a request that this server could not understand.

    However, it worked fine with [R]. Update: when using ProxyPass instead, I get this error:

        Forbidden
        You don't have permission to access / on this server.
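
    For the record, [PT] only passes the rewritten URL back to Apache's internal URL-to-file mapping, so it cannot target an external host; proxying needs either the [P] flag or ProxyPass, both backed by mod_proxy. A sketch for Apache 2.2, assuming mod_proxy and mod_proxy_http are loaded - the Forbidden from ProxyPass usually means proxy access is still denied in the config:

        ProxyPass        /builders/ http://buildbot.synergy-foss.org:8000/builders/
        ProxyPassReverse /builders/ http://buildbot.synergy-foss.org:8000/builders/
        <Proxy *>
            Order deny,allow
            Allow from all
        </Proxy>

    (Leaving ProxyRequests off, the default, keeps this a reverse proxy only, so the server cannot be abused as an open forward proxy.)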

  • "Headers already sent" warnings after moving files from one server to another

    - by PERR0_HUNTER
    Hey there! I have a project running on DreamHost hosting, and it's working fine, but since DreamHost has been getting really slow, I'm moving the project to my new dedicated server (Ubuntu 8.04). The thing is that after I move all of my files to the new server, I see warnings all over the place telling me that the headers have already been sent. The first thing I tried was moving the files via FTP: download to my machine, upload to the server - didn't work. My second try was to tar.gz the folder on the first server and untar it on the new one; that didn't work either. I tried changing the encoding to ANSI, and the files start working, but most of my files contain accents, so ANSI is not an option - I need UTF-8 for all my files. Any ideas on how to fix this? I'm sure it must be some sort of config.
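
    The ANSI observation is the giveaway: the editor is most likely saving the files as UTF-8 with a BOM, three bytes (EF BB BF) that PHP outputs to the browser before any header() call, triggering the warning. Re-saving as "UTF-8 without BOM" keeps the accents and fixes the errors. A sketch for finding the affected files on the server (bash syntax):

        # list PHP files containing a UTF-8 byte-order mark
        grep -rl $'\xEF\xBB\xBF' --include='*.php' /var/www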

  • Deploy multiple django instances on one Host [migrated]

    - by tvn
    I am trying to set up multiple Django instances on one host with lighttpd. My problem is getting Django's FCGI working in subdirectories served by my webserver. The aim is the following:

        www.myhost.org/django0 -> django1.fcgi on localhost:3000
        www.myhost.org/django1 -> django2.fcgi on localhost:3001
        www.myhost.org/django2 -> django3.fcgi on localhost:3002

    Unfortunately the following configuration doesn't work for even one of them:

        $HTTP["url"] =~ "^/django0/static($|/)" {
            server.document-root = "/home/django0/django/static/"
        }
        $HTTP["url"] =~ "^/django0/media($|/)" {
            server.document-root = "/usr/lib/python2.7/dist-packages/django/contrib/admin/media/"
        }
        $HTTP["url"] =~ "^/django0($|/)" {
            proxy.server = ( "" => ( (
                "host" => "127.0.0.1",
                "port" => "3001",
                "check-local" => "disable",
            ) ) )
        }

    The only response I get is a 404, and even that takes a long time to arrive. I found nothing suspicious in either the access.log or the error.log.
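
    Two things stand out, offered as guesses rather than a verified fix. First, the config proxies /django0 to port 3001, while the plan maps it to 3000. Second, and more fundamentally: a Django .fcgi process speaks the FastCGI protocol, not HTTP, so lighttpd's proxy module cannot talk to it - the FastCGI module can. A sketch of the shape that usually works, one prefix per instance:

        fastcgi.server = (
            "/django0" => ( (
                "host" => "127.0.0.1",
                "port" => 3000,
                "check-local" => "disable",
            ) ),
            "/django1" => ( (
                "host" => "127.0.0.1",
                "port" => 3001,
                "check-local" => "disable",
            ) ),
        )

    Django also needs to know it lives under a subdirectory (e.g. FORCE_SCRIPT_NAME = "/django0" in that instance's settings.py), or its generated URLs will point at the site root.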
