Search Results

Search found 7306 results on 293 pages for 'wcf proxy'.

Page 239/293 | < Previous Page | 235 236 237 238 239 240 241 242 243 244 245 246  | Next Page >

  • How to check Cookie header line and custom cache on Nginx

    - by user124249
    I am trying to set up caching for my website using the Nginx proxy module, with the following requirements: if the request has a cookie (in the request header), the response will use the Nginx cache and the Set-Cookie header line should be hidden; if the request has no cookie (in the request header), the request should be forwarded to the backend and the Set-Cookie header line should not be hidden. I am using if (from the rewrite module) and these directives:

        if (!-e $http_cookie) {
            set $not_cache_rq 0;
            set $not_cache_rp 0;
        }
        if ($http_cookie) {
            set $not_cache_rq 1;
            set $not_cache_rp 1;
        }
        proxy_cache_bypass $not_cache_rq;
        proxy_no_cache $not_cache_rp;
        proxy_hide_header Set-Cookie;

    I do not know how to apply the proxy_hide_header option conditionally, depending on whether the request header has a cookie or not. Please help me. Many thanks.
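
    A hedged sketch of one way to express this without if, using a map block to derive the flag from the Cookie header; the upstream name backend, the cache zone mycache, and where each flag is applied are illustrative assumptions, not a verified answer to the question above.

        # Illustrative only: "backend" and "mycache" are placeholder names.
        # $has_cookie is 1 when the request carries a Cookie header, 0 otherwise;
        # feed it (or its inverse) to proxy_cache_bypass / proxy_no_cache,
        # depending on which of the two cases should be served from the cache.
        # The map block belongs in the http {} context.
        map $http_cookie $has_cookie {
            default 1;
            ""      0;
        }

        location / {
            proxy_pass         http://backend;
            proxy_cache        mycache;
            proxy_cache_bypass $has_cookie;
            proxy_no_cache     $has_cookie;
            # proxy_hide_header is static configuration, not evaluated per request;
            # hiding Set-Cookie for only one of the two cases generally means
            # routing them to two different location blocks.
            proxy_hide_header  Set-Cookie;
        }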

    Read the article

  • Dynamic form creation from XSD in ASP.NET

    - by CitadelCSAlum
    I know there is a lot of documentation on the internet about generating forms from XSD, but I have not been able to come across anything straightforward enough for my situation. I am working with a WCF web service that is going to fetch an .xsd XML schema and must return the HTML of a form based on that schema. Are there any third-party tools that can help out with this, and if so, what are they? If not, do you have any suggestions, better methods, etc. for how this can be done?
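
    In case the do-it-yourself route is of interest, here is a minimal sketch (not a tool recommendation) that loads an .xsd with System.Xml.Schema and emits one text input per global element; the flat traversal and text-only inputs are simplifying assumptions, and a real schema would need recursive handling of complex types.

        using System.Text;
        using System.Xml;
        using System.Xml.Schema;

        public static class XsdFormSketch
        {
            // Minimal sketch: one <input> per global element declared in the schema.
            // Real schemas need recursive handling of complex types, groups, etc.
            public static string BuildForm(string xsdPath)
            {
                XmlSchema schema;
                using (var reader = XmlReader.Create(xsdPath))
                {
                    schema = XmlSchema.Read(reader, null);
                }

                var set = new XmlSchemaSet();
                set.Add(schema);
                set.Compile();   // populates schema.Elements

                var html = new StringBuilder("<form method=\"post\">\n");
                foreach (XmlSchemaElement element in schema.Elements.Values)
                {
                    html.AppendFormat(
                        "  <label>{0}</label> <input type=\"text\" name=\"{0}\" />\n",
                        element.Name);
                }
                html.Append("  <input type=\"submit\" value=\"Send\" />\n</form>");
                return html.ToString();
            }
        }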

    Read the article

  • IIS App Pool Identity Internet Settings

    - by Programming Hero
    How does an IIS app pool determine its Internet settings? I'm specifying a custom identity under which to host a .NET web application: a service account that is part of our Active Directory domain. When the application runs, it needs to make HTTP requests to other servers. This causes it to read web and proxy settings from some location, but I can't work out where it goes for this information. Does it look at the default account's settings on that box, at the default profile on the AD server, at its own local/roaming profile, at a combination of the above, or somewhere completely different?
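
    For what it's worth, one way to sidestep the profile question entirely is to pin the proxy in the application's own configuration; the fragment below is a generic .NET web.config sketch (the proxy address is a placeholder), not a statement about where IIS actually looks by default.

        <!-- Hypothetical web.config fragment; the proxy address is a placeholder. -->
        <configuration>
          <system.net>
            <defaultProxy enabled="true" useDefaultCredentials="true">
              <proxy proxyaddress="http://proxy.example.local:8080" bypassonlocal="true" />
            </defaultProxy>
          </system.net>
        </configuration>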

    Read the article

  • College wifi works easily on Linux, but not on Windows

    - by user52849
    In Linux: after connecting to the college wifi, going to the network login page, and logging in, the internet works perfectly as it should. In Windows: after connecting to the college wifi, going to the network login page, and logging in, Windows shows "Internet access" and the wireless icon turns white. But even after that, regardless of the browser being used, attempting to access any page just shows "Sending request". It does work after a lot of tries, but only in intervals. Yet when running Ubuntu 11.10 in VirtualBox, it works properly, just like when booting into Ubuntu, even while it isn't working on Windows itself. The college wifi service is really crappy and they have been unable to solve this problem. I'm pretty sure there should be a solution for this, but what? What is it that Ubuntu is doing right and Windows isn't? Windows is set to "Automatically detect settings" and no proxy server is used.

    Read the article

  • Wyse Simple Imager. Unable to Create Product Directory

    - by Steve
    I am trying to submit a post on www.technicalhelp.de, but I receive an error: "Invalid Session. Please resubmit the form." This happens even if I delete temporary internet files and log out and back in, if I use a different browser, or if I use a proxy browser. Perhaps someone on this forum can help. I am trying to push a Wyse device image to a USB thumb drive. The image is on a remote server, and the thumb drive is connected to my desktop PC. I am using Wyse Simple Imager to do this. When I select the following: Product: V90, Image Version: 5.010627.512, Image File: \servername\folder\OLD_Rapport\V90-withusb\9V90.i2d, then almost instantly, without attempting any action, I receive the message: "WyseImager: Unable to Create Product Directory. Add Image Failed." I have completely formatted the USB drive with FAT32. It is new out of the box, and I can create folders on it. How do I fix this?

    Read the article

  • 2 workstations won't connect to most websites, but will connect to some

    - by Dean
    I have a very frustrating issue I wasn't able to solve: two workstations used by the same user are not able to connect to most websites (they receive a timeout); however, they will load some websites, specifically ones from my country. They are able to resolve the website addresses via DNS. Both stations get their internet connection through a remote router. Other stations in the same LAN are connecting fine. Here's what I tried: a virus scan, renewing IPs, resetting the workstations, moving one workstation to a different RJ-45 port in the wall, resetting the hub and switch, checking the hosts file, and a DNS flush. Nothing seems to help. I am preparing a CD with more AV tools to see if there's anything hiding on the stations. UPDATE: It was an incorrect configuration in "Internet Options". I configured the correct proxy and now it works.
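
    Since the culprit turned out to be the proxy setting in "Internet Options", a quick way to compare that setting across machines from a command prompt is sketched below; these are the standard per-user WinINET registry values, shown for reference only.

        rem Inspect the current user's WinINET proxy configuration
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v ProxyEnable
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v ProxyServer
        reg query "HKCU\Software\Microsoft\Windows\CurrentVersion\Internet Settings" /v AutoConfigURL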

    Read the article

  • Reasonably Secure Alternative to Poptop PPTP Server for Ubuntu server and Windows clients?

    - by wag2639
    I have a Poptop server running on an old Fedora server, but I'm upgrading to an Ubuntu 10.04 server. I was wondering if there are any good, reasonably secure alternatives to Poptop that I can install on our new Ubuntu server as a way to give Windows clients (XP and 7) VPN access into our intranet. We only use the VPN to access files located inside the network; we do not need to use it as a proxy/gateway. I've looked into OpenVPN, but it seemed way too complicated, and I would prefer something built into Windows. A Windows 7-only solution is OK.

    Read the article

  • Two domains accessing same folder

    - by Liam Quinn
    I've just taken a new role in a school and am still familiarizing myself with their network; however, I have recently been given a task and I'm having a little trouble working out the fundamentals of it. I have an admin network/domain (10.49.x.x) and a classroom network/domain (192.168.1.x); both connect to a proxy server (10.49.202.231/192.168.1.51). Each domain has its own shared folders as you'd expect (files, software installs, etc.); however, there is a folder "staff" on the classroom network that all the teachers on the classroom network can access. The users on the admin network would like to access this same folder. How do I go about making this happen?

    Read the article

  • NHibernate auditing in disconnected mode

    - by Ciaran
    I'm developing an app with a Silverlight UI, transferring my domain objects over WCF and persisting them via NHibernate. I'm therefore working with NHibernate in a disconnected mode. I'm already using the NHibernate PreUpdate and PreInsert EventListeners to perform some metadata operations (updating Create/Update date, created/updated by etc) and they are working fine. I now have a requirement to perform data logging on some of my domain objects. So I will need to have an audit table that has a before-save and after-save state of certain entities. I had wanted to use the @event.Persister.OldState and @event.Persister.NewState to perform this logging, but because I am in a disconnected scenario (using different Sessions from when data is retrieved to when it is persisted), @event.Persister.OldState is null when I am saving my changes back to the database. How is anyone else doing data logging in a disconnected scenario with NHibernate?
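
    One workaround that is sometimes used in disconnected scenarios, sketched below, is to fetch the database copy of the entity yourself inside the pre-update listener, using a second session opened from the same factory, and diff it against the incoming detached state; AuditLogger and the comparison step are placeholders, and this is not presented as the NHibernate-blessed approach.

        using NHibernate;
        using NHibernate.Event;

        // Sketch only: fetch the current DB row in a separate session so that an
        // "old state" exists even when the entity arrived detached over WCF.
        public class AuditPreUpdateListener : IPreUpdateEventListener
        {
            public bool OnPreUpdate(PreUpdateEvent @event)
            {
                // Open an independent session against the same factory; the event's
                // own session is in the middle of flushing and is not reused here.
                using (ISession auditSession = @event.Session.Factory.OpenSession())
                {
                    object databaseCopy = auditSession.Get(@event.Entity.GetType(), @event.Id);

                    // Placeholder: compare databaseCopy with @event.Entity / @event.State
                    // and write the before/after values to the audit table here.
                    AuditLogger.Log(databaseCopy, @event.Entity);
                }

                return false; // do not veto the update
            }
        }

        // Placeholder sink for the audit records.
        public static class AuditLogger
        {
            public static void Log(object before, object after) { }
        }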

    Read the article

  • Is Apache ReverseProxy to Passenger Standalone an acceptable production deployment?

    - by davetron5000
    I have the need to deploy Rails 3 apps, using RVM and gemsets, and am expecting “public” traffic (i.e. this is not an internal-only app). I also must use Apache as the public interface to my app. I understand that Passenger Standalone can help accomplish the Rails/RVM end, and I have successfully set it up in my development environment. My question is how viable this setup is for a production deployment. Is deploying via Apache configured to reverse-proxy to my Passenger-powered Rails app going to create problems? Since I'm designing the production deployment now, I want to understand whether I should spend the additional time setting up Passenger connected to Apache and have that Passenger communicate with the Passenger Standalone instance running my Rails app. So, I'm looking for one of, I guess, three answers: (1) Apache reverse proxy to Passenger Standalone will be generally fine; (2) you should not use the Apache/Passenger Standalone configuration, but set up Passenger on the Apache side as well; (3) your entire setup is just wrong, please RTFM (and include a link to "FM").
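
    For reference, a bare-bones Apache virtual host that reverse-proxies to a Passenger Standalone instance might look like the sketch below; the server name and port 3000 are placeholders, and this is only meant to make the first option concrete, not to say it is the right one of the three.

        # Illustrative only; hostname and port are placeholders.
        # Requires mod_proxy and mod_proxy_http to be enabled.
        <VirtualHost *:80>
            ServerName myapp.example.com

            ProxyPreserveHost On
            ProxyPass        / http://127.0.0.1:3000/
            ProxyPassReverse / http://127.0.0.1:3000/
        </VirtualHost>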

    Read the article

  • Which Visual Studio 2010 edition for sole developer

    - by bufferz
    I am the sole .NET developer for a small company. My projects span many .NET technologies, including WinForms, WPF, SQL, XNA, LINQ, WCF, WTF?, and others. I struggle to stay on top of all these projects, so I'm looking to make my life easier with the release of VS2010. Without a mentor, I rely heavily on Stack Overflow and whatever else Google comes up with. Should I convince my company to get an edition with an MSDN subscription? Is it one of those things where, once you have it, you can't imagine life without it? What about the source control that comes with VS2010: do you all find it better than an SVN server? We're looking to hire another programmer this year; would I be best off getting a Team edition of VS2010 to be best prepared for that hire? Thanks!

    Read the article

  • Ubuntu 9.10 Download Speed Very Slow

    - by Don
    I'm running Ubuntu 9.10 desktop and I'm new to the Linux world, so bear with me. I'm on a corporate network of 3 T1s shared across 50-60 users. I typically get about 300 KB/sec for downloads, but for whatever reason, the Linux box will start out in that range and then drop to less than 1 KB/sec at times. It doesn't seem to matter where I'm downloading from. Right now I'm trying to get Eclipse for PHP and it's running at 3-6 KB/sec. Getting updates for the system will also drop to very slow rates. Our IT person has set up the machine to get the same 10.0.0.x address when it starts, and has set this IP to bypass our proxy/firewall on the way out, so that shouldn't be the issue. Can anyone recommend something I can try to better diagnose the problem? Again, I'm new to the Linux world and to the hardware/OS setup side in general (coming from more of a coding background). Thanks for any advice.

    Read the article

  • Is multithreading the right way to go for my case?

    - by Julien Lebosquain
    Hello, I'm currently designing a multi-client / server application. I'm using plain good old sockets because WCF or similar technology is not what I need. Let me explain: it isn't the classical case of a client simply calling a service; all clients can 'interact' with each other by sending a packet to the server, which will then do some action, and possibly re-dispatch an answer message to one or more clients. Although doable with WCF, the application will get pretty complex with hundreds of different messages.

    For each connected client, I'm of course using asynchronous methods to send and receive bytes. I've got the messages fully working, everything's fine. Except that for each line of code I'm writing, my head just burns because of multithreading issues. Since there could be around 200 clients connected at the same time, I chose to go the fully multithreaded way: each received message on a socket is immediately processed on the thread pool thread on which it was received, not on a single consumer thread. Since each client can interact with other clients, and indirectly with shared objects on the server, I must protect almost every object that is mutable. I first went with a ReaderWriterLockSlim for each resource that must be protected, but quickly noticed that there are more writes overall than reads in the server application, and switched to the well-known Monitor to simplify the code. So far, so good. Each resource is protected, and I have helper classes that I must use to get a lock and its protected resource, so I can't use an object without getting a lock. Moreover, each client has its own lock that is entered as soon as a packet is received from its socket. It's done to prevent other clients from making changes to the state of this client while it has some messages being processed, which is something that will happen frequently.

    Now, I don't just need to protect resources from concurrent accesses. I must keep every client in sync with the server for some collections I have. One tricky part that I'm currently struggling with is the following: I have a collection of clients, and each client has its own unique ID. When a client connects, it must receive the IDs of every connected client, and each one of them must be notified of the newcomer's ID. When a client disconnects, every other client must know it so that its ID is no longer valid for them. Every client must always have, at a given time, the same clients collection as the server, so that I can assume that everybody knows everybody. This way, if I'm sending a message to client #1 telling "Client #2 has done something", I know that it will always be correctly interpreted: Client 1 will never wonder "but who is Client 2 anyway?".
    My first attempt at handling the connection of a new client (let's call it X) was this pseudo-code (remember that newClient is already locked here):

        lock (clients)
        {
            foreach (var client in clients)
            {
                lock (client)
                {
                    client.Send("newClient with id X has connected");
                }
            }
            clients.Add(newClient);
            newClient.Send("the list of other clients");
        }

    Now imagine that at the same time another client has sent a packet that translates into a message that must be broadcast to every connected client. The pseudo-code will be something like this (remember that the current client, let's call it Y, is already locked here):

        lock (clients)
        {
            foreach (var client in clients)
            {
                lock (client)
                {
                    client.Send("something");
                }
            }
        }

    An obvious deadlock occurs here: on one thread X is locked, the clients lock has been entered, the loop over the clients has started, and at some point it must get Y's lock... which is already acquired on the second thread, itself waiting for the clients collection lock to be released! This is not the only case like this in the server application. There are other collections which must be kept in sync with the clients, some properties on a client can be changed by another one, etc. I tried other types of locks, lock-free mechanisms and a bunch of other things. Either there were obvious deadlocks when I used too many locks for safety, or obvious race conditions otherwise. When I finally find a good middle point between the two, it usually comes with very subtle race conditions, deadlocks and other multithreading issues... my head hurts very quickly, since for any single line of code I'm writing I have to review almost the whole application to ensure everything will behave correctly with any number of threads. So here's my final question: how would you resolve this specific case, and the general case, and more importantly: am I going the wrong way here? I have few problems with the .NET framework, C#, simple concurrency or algorithms in general. Still, I'm lost here. I know I could use only one thread to process the incoming requests and everything would be fine, but that won't scale well at all with more clients... although I'm thinking more and more about going this simple way. What do you think? Thanks in advance to the StackOverflow people who have taken the time to read this huge question. I really had to explain the whole context if I want to get some help.
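
    One hedged way around this particular deadlock, sketched below, is to never take a per-client lock while the clients lock is held: mutate and snapshot the collection under its own lock, release it, and only then lock each client one at a time. This illustrates the lock-ordering idea only; it is not a drop-in fix for the rest of the server.

        using System.Collections.Generic;

        // Sketch only: broadcast without nesting per-client locks inside the
        // "clients" collection lock. "Client" is a placeholder for the real class.
        public class Client
        {
            public void Send(string message)
            {
                // Placeholder for the real asynchronous socket send.
            }
        }

        public class ConnectionManager
        {
            private readonly List<Client> clients = new List<Client>();

            public void OnClientConnected(Client newClient)
            {
                List<Client> snapshot;

                // Hold the collection lock only long enough to mutate and copy it.
                lock (clients)
                {
                    clients.Add(newClient);
                    snapshot = new List<Client>(clients);
                }

                // Per-client locks are taken one at a time, never while "clients"
                // is held, so the broadcast path below can no longer deadlock
                // against this method.
                foreach (var client in snapshot)
                {
                    if (ReferenceEquals(client, newClient))
                        continue;

                    lock (client)
                    {
                        client.Send("newClient with id X has connected");
                    }
                }

                lock (newClient)
                {
                    newClient.Send("the list of other clients");
                }
            }

            public void Broadcast(string message)
            {
                List<Client> snapshot;
                lock (clients)
                {
                    snapshot = new List<Client>(clients);
                }

                foreach (var client in snapshot)
                {
                    lock (client)
                    {
                        client.Send(message);
                    }
                }
            }
        }

    The trade-off is that a client in the snapshot may already have disconnected by the time it is messaged, so Send has to tolerate that case.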

    Read the article

  • Random timeout now and then

    - by KenavR
    Maybe this is too generic a question, but since we have had this issue for quite a while now, I'll give it a shot. We have some applications which use HTTP for the connection between the client (website or fat client) and the server. The computer that runs these applications is on a network behind a firewall and a proxy; the server isn't inside the same network. The problem is that every now and then the HTTPS request times out and, depending on the client, the application "hangs" or does some other funky stuff. The problem is definitely inside our network, because if I try the applications outside our network they work fine. Can you give me a hint about where I can most likely find the problem?

    Read the article

  • Squid3 not working. Access denied.

    - by Nitish
    I installed Squid3 on a Linux machine with two Ethernet interfaces (eth0 and eth1). I used the default settings in the squid.conf file and uncommented the two lines acl localnet src 192.168.0.0/16 and http_access allow localnet. eth0 is connected to a router, which provides Internet access; it is assigned the IP 192.168.1.2 by the router. I manually configured eth1 to have the IP address 192.168.5.1; it is connected to a switch, and systems with IP addresses 192.168.5.x are connected to this switch. I ran these two commands for NAT:

        iptables -t nat -A PREROUTING -i eth1 -p tcp -m tcp --dport 80 -j DNAT --to-destination 192.168.5.1:3128
        iptables -t nat -A PREROUTING -i eth0 -p tcp -m tcp --dport 80 -j REDIRECT --to-ports 3128

    But when I try to access the internet from a system with IP 192.168.5.2 through the proxy, I get an error that says "Access denied". What is wrong with my configuration?
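
    For comparison only: interception setups on Squid 3.1+ usually flag the listening port explicitly (older 3.x releases used the transparent keyword), and the LAN-side rule is typically a REDIRECT rather than a DNAT to the proxy's own address. The fragment below is a sketch of that variant, not a confirmed fix for this particular "Access denied".

        # squid.conf (sketch): mark the port as an interception port
        # (use "transparent" instead of "intercept" on Squid 3.0)
        http_port 3128 intercept

        acl localnet src 192.168.0.0/16
        http_access allow localnet

        # iptables (sketch): redirect LAN web traffic arriving on eth1 to Squid
        iptables -t nat -A PREROUTING -i eth1 -p tcp --dport 80 -j REDIRECT --to-ports 3128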

    Read the article

  • Workflow Foundation: Asynchronous operations (lengthy network I/O)

    - by StormianRootSolver
    I have to create an application that will be started a few times per day (it's non-interactive). To operate, it needs LARGE amounts of data from the Internet (megabytes) via a rather slow connection, so the WCF service calls take quite some time. At the same time, it needs to perform local calculations and has a sophisticated initialization process. So, what I want to do is to create a workflow that asynchronously fetches the data (which takes a few minutes) while already initializing and calculating locally. Is there a way to accomplish this?
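
    If this is WF4, one way to express "download while initializing" is a Parallel activity whose slow branch is an AsyncCodeActivity wrapping the service call; the sketch below shows only the async activity shell, with FetchData() standing in for whatever WCF proxy call the real service exposes.

        using System;
        using System.Activities;

        // Sketch of a WF4 AsyncCodeActivity. FetchData() is a hypothetical stand-in
        // for the slow WCF call; hosting this activity in one branch of a Parallel
        // activity lets another branch run the local initialization and calculation
        // at the same time.
        public sealed class DownloadDataActivity : AsyncCodeActivity<byte[]>
        {
            protected override IAsyncResult BeginExecute(
                AsyncCodeActivityContext context, AsyncCallback callback, object state)
            {
                Func<byte[]> work = FetchData;
                context.UserState = work;          // stash the delegate for EndExecute
                return work.BeginInvoke(callback, state);
            }

            protected override byte[] EndExecute(
                AsyncCodeActivityContext context, IAsyncResult result)
            {
                var work = (Func<byte[]>)context.UserState;
                return work.EndInvoke(result);
            }

            private static byte[] FetchData()
            {
                // Placeholder: call the real WCF proxy here and return the payload.
                return new byte[0];
            }
        }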

    Read the article

  • Cannot access internal website after being connected to other network

    - by Dandroid
    I have a client who claims they cannot access their internal website after they have, for example, been out traveling. They have to reset their browser settings every time to be able to get access again. As they live on another continent and in another timezone, it's hard for me to run live tests with them to see what changes in their browser settings. My first guess would be that it's some sort of proxy-related issue, but I want to know if there could be other reasons for this. It's not the LAN itself; that we are sure of. It's browser-specific only. Edit: It's worth mentioning that it only happens when they turn off or restart their computers.

    Read the article

  • What Windows Form control would be a good fit for this use case?

    - by Sergio Tapia
    I'm going to create an open-source help desk solution, free of charge, for small to medium businesses to use. I'm currently working on the client application. I want to have a list of tickets that have been opened by the user, so it would be like a table TicketsByUser:

        Ticket Number | Type     | Description    | Date       | Handled?
        123456        | Hardware | My mouse broke | 10/20/2010 | No
        123456        | Hardware | My mouse broke | 10/20/2010 | Yes

    I was thinking of using ListView because of its name, but I have zero experience with it, so maybe it's not what I'm looking for. I'm going to be pulling the data from a WCF service, which in turn pulls it from an MS SQL database.
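
    For a read-only grid like the one above, either a ListView in Details mode or a DataGridView bound to the WCF results would fit; below is a minimal ListView sketch using the columns from that table and a hypothetical Ticket class standing in for the service's data contract.

        using System.Collections.Generic;
        using System.Windows.Forms;

        // Minimal sketch: a ListView in Details mode showing the ticket columns above.
        // "Ticket" is a hypothetical DTO as it might come back from the WCF service.
        public class Ticket
        {
            public string Number { get; set; }
            public string Type { get; set; }
            public string Description { get; set; }
            public string Date { get; set; }
            public bool Handled { get; set; }
        }

        public static class TicketListBuilder
        {
            public static ListView Build(IEnumerable<Ticket> tickets)
            {
                var list = new ListView
                {
                    View = View.Details,
                    FullRowSelect = true,
                    Dock = DockStyle.Fill
                };

                list.Columns.Add("Ticket Number");
                list.Columns.Add("Type");
                list.Columns.Add("Description", 200);
                list.Columns.Add("Date");
                list.Columns.Add("Handled?");

                foreach (var t in tickets)
                {
                    list.Items.Add(new ListViewItem(new[]
                    {
                        t.Number, t.Type, t.Description, t.Date, t.Handled ? "Yes" : "No"
                    }));
                }

                return list;
            }
        }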

    Read the article

  • Forward Request to Multiple Servers

    - by cactuarz
    We have two servers: one is the old server and the other is the new one. We are currently doing a migration because the old server is no longer capable of handling the everyday request load. The specs are: old server: Ubuntu 10.04, Nginx as reverse proxy, Apache, WSGI, Python/Django; new server: Ubuntu 10.04, Nginx, Gunicorn, Python/Django, Celery+Redis. Our manager asked us to research whether the old server can forward all incoming requests to multiple servers; for example, set the old server's Nginx to forward every request to both the old and the new server. The purpose is to test the new server, using the old server as a comparison, to see if the new server is ready to take over the role. Please let us know if there is a way to do this, whether we must install some extra module, or whether what we want to do is impossible. Many thanks.
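
    For comparison, newer nginx releases (1.13.4 and later) ship a mirror directive that duplicates each request to a second upstream and discards the mirrored response; the sketch below assumes placeholder upstream names and would mean upgrading the 10.04-era nginx, so treat it as one possible direction rather than a drop-in answer.

        # Sketch for nginx >= 1.13.4 (ngx_http_mirror_module); upstream names are
        # placeholders. The mirrored copy's response is discarded, so only the old
        # backend's answer reaches the client.
        upstream old_backend { server 127.0.0.1:8080; }
        upstream new_backend { server 192.168.0.20:8080; }

        server {
            listen 80;

            location / {
                mirror /mirror;
                mirror_request_body on;
                proxy_pass http://old_backend;
            }

            location = /mirror {
                internal;
                proxy_pass http://new_backend$request_uri;
            }
        }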

    Read the article

  • Restrict only some plugins to specific sites in Google Chrome

    - by Christian
    I am looking for a way to set up Google Chrome so that it will run a certain plug-in (Java, what else?) only on whitelisted sites, but other plug-ins (like the PDF viewer) everywhere. From playing with the policies available for Chrome, I think there are basically two levels of plug-in management. First, the list of disabled/enabled plug-ins, which controls whether a plug-in exists for the browser at all; this pair of policies applies to plug-ins, but not to sites. Second, default plug-in settings / allow plug-ins on sites, which controls on which sites plug-ins can run; this set of policies applies to sites, but not to individual plug-ins, and it cannot override the first pair. There appears to be no way to configure Chrome so that some plug-ins only run on whitelisted sites, but others run everywhere by default. I have also looked at filtering content on the firewall/proxy level, but I'm not convinced it can be done securely there. Filtering by URLs (file names) or content types can be circumvented trivially, and identification by content inspection cannot be safe either.

    Read the article

  • Problem with the hosts file under Windows 7

    - by martani_net
    I updated some entries in the hosts file (in "C:\WINDOWS\System32\drivers\etc") to make google, for example, point to 127.0.0.1:

        # Additionally, comments (such as these) may be inserted on individual
        # lines or following the machine name denoted by a '#' symbol.
        #
        # For example:
        #
        #      102.54.94.97     rhino.acme.com          # source server
        #       38.25.63.10     x.acme.com              # x client host

        127.0.0.1       localhost
        ::1             localhost
        127.0.0.1       google.com

    This works fine under Windows Vista, but not under Windows 7. When I type google, it goes directly to Google's website. For info, I am not using a proxy server. I think there are some temporary DNS settings that must be flushed, but I don't know how; does anyone know how to fix this? Thank you.
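
    If it is a caching issue, the usual first checks from an elevated command prompt are sketched below: flush the resolver cache and confirm that the file really is named hosts (with no hidden .txt extension) and still contains the entry. This is generic Windows advice, not a confirmed diagnosis.

        rem Flush the Windows DNS resolver cache, then retry the browser
        ipconfig /flushdns

        rem Confirm the file name and that the entry is actually present
        dir C:\Windows\System32\drivers\etc
        type C:\Windows\System32\drivers\etc\hosts | find "google.com"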

    Read the article

  • Gmail/Facebook/Hotmail not opening in Firefox/IE on Windows 7 Home

    - by singlepoint
    Hi, I am unable to open Gmail/Facebook/Hotmail in Firefox/IE on Windows 7 Home. I just unboxed a brand new HP laptop with Norton Security Suite running on it. I get the following error message in Firefox. Please help.

        The connection has timed out
        The server at www.google.com is taking too long to respond.
          * The site could be temporarily unavailable or too busy. Try again in a few moments.
          * If you are unable to load any pages, check your computer's network connection.
          * If your computer or network is protected by a firewall or proxy, make sure that Firefox is permitted to access the Web.

    Read the article

  • Fully secured gateway web sites

    - by SeaShore
    Hello. Are there any websites that serve as gateways for fully encrypted communication? I mean sites with which I can open a secured session and then exchange both URLs and content with other sites through them in a secure way. Thanks in advance. UPDATE: Sorry for not being clear. I was wondering if there is a way to access any site over the Internet (HTTP or HTTPS) without letting any intranet proxy read the requested URL or the received content. My question is whether such a site exists, e.g.: I connect to that site via HTTPS, I send it a URL in a secured way, the site gets the content from the target site (possibly in a non-secured way) and returns the requested content to me in a secured way.

    Read the article

  • SSH Tunnel doesn't work in China

    - by Martin
    Last year I was working in China for a few months. I never bothered setting up a real VPN; I just created an SSH tunnel and changed my browser's proxy settings to connect through it. Everything worked great (except Flash, of course), but that was fine. However, now I'm back in China and I'm having problems with this approach. I do the same thing as last time, and according to https://ipcheckit.com/ my IP address is indeed the IP of my (private) server in the US, and I'm logging in to my server using a fingerprint I created long before going to China, so no MITM should be possible. Furthermore, the certificate from ipcheckit.com is from GeoTrust, so everything should be OK. However, I still can't access sites which are blocked in China. Any idea how this could be possible?
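
    One detail worth checking, illustrated below, is whether the browser still resolves DNS locally even though the traffic itself goes through the tunnel; with a dynamic (SOCKS) forward, Firefox only sends DNS lookups through the proxy when network.proxy.socks_remote_dns is enabled. The host name and port are placeholders, and this is a guess at a common cause, not a confirmed diagnosis.

        # Dynamic SOCKS tunnel to the US server (host and port are placeholders)
        ssh -D 1080 -N user@my-us-server.example.com

        # In the browser, point the SOCKS v5 proxy at localhost:1080 and, in
        # Firefox's about:config, set:
        #   network.proxy.socks_remote_dns = true
        # so DNS lookups also go through the tunnel instead of the local resolver.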

    Read the article

  • Debian installation without internet connection

    - by Gobliins
    Hi, I want to install some Debian distributions (Grip, Crush, Lenny...) for ARM/armel architectures (see www.emdebian.org). I am following this guide: www.aurel32.net/info/debian_arm_qemu.php. The problem I have is that I don't have an internet connection in my Linux VM or QEMU; I am behind a proxy. I want to know: is there a way I can download all the needed files and save them to disk, so that I don't need an internet connection during the installation? I am working under Windows at the moment. My regards.
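
    As a generic sketch of the staging idea (package names and paths are placeholders, and whether this covers everything the emdebian guide needs is untested here): download the .deb files on a machine that does have connectivity, carry them over, and install them locally.

        # On a Debian/Ubuntu machine with internet access:
        apt-get install --download-only some-package
        cp /var/cache/apt/archives/*.deb /media/usbstick/

        # On the offline VM / QEMU guest:
        dpkg -i /media/usbstick/*.deb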

    Read the article
