Search Results

Search found 19625 results on 785 pages for 'local groups'.


  • HTTP traffic through PIX VPN from outside site

    - by fwrawx
    I have a remote site with a website that only allows access from the outside IP assigned to our local PIX. I have users connecting to the local network over a VPN who need to be able to view this remote site. I don't think this works because the packets want to come in and go out over the same (ext) interface. So I'm looking for a way to make this work using the PIX, or by setting up a service on a server on the local network to act as a middle-man for the HTTP requests. The remote site doesn't support setting up a VPN to our PIX, and the remote website is dishing out pages over a non-standard port. Can I use squid or something similar to proxy just one site?
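    One way to play middle-man is a small squid instance on an internal server configured as an accelerator for that single site. The following is only a sketch under assumptions: remote.example.com and port 8081 are placeholders for the real remote host and its non-standard port, which the question doesn't name.

        # /etc/squid/squid.conf -- minimal single-site reverse-proxy sketch
        # (hostnames and ports are placeholders, not taken from the question)
        http_port 3128 accel defaultsite=remote.example.com
        cache_peer remote.example.com parent 8081 0 no-query originserver name=remotesite
        acl remote_site dstdomain remote.example.com
        http_access allow remote_site
        http_access deny all

    Internal VPN users would then point at http://internal-server:3128/, and squid would fetch the pages using the internal server's normal outbound path through the PIX, which the remote site already allows.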


  • How to setup external mail addresses without external autodiscover tries?

    - by Tarnschaf
    We have a little Exchange/Outlook installation here that fetches mail from our provider with POP3. To be able to send emails outside our organisation, I added another SMTP address to the Exchange user: [email protected] (default / reply address) and boss@company.local. Sending email works using the default address, but now there is an error message each time we start Outlook: it tries to autodiscover using autodiscover.ourcompany.com, which doesn't exist. Our autodiscover files are placed on our local server, and I think all the services are discovered correctly, because everything works as expected. Everything except the error message on each Outlook start. (The error message is actually because of an invalid certificate, but I don't see why Outlook should contact an external host at all!) So how can I solve this? Forcing autodiscover on every Outlook client to use the local hosts? Or is there an even better way?
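    If forcing the clients does turn out to be the way to go, Outlook's individual Autodiscover lookups can be switched off per client through registry policy values. The lines below are a hedged sketch: the value names are the documented Outlook AutoDiscover switches, but the Office version key (14.0, i.e. Outlook 2010) is an assumption and needs adjusting for other versions. They only disable the two external HTTPS lookups that trigger the certificate warning.

        rem run on each affected client, then restart Outlook
        reg add "HKCU\Software\Microsoft\Office\14.0\Outlook\AutoDiscover" /v ExcludeHttpsAutoDiscoverDomain /t REG_DWORD /d 1 /f
        reg add "HKCU\Software\Microsoft\Office\14.0\Outlook\AutoDiscover" /v ExcludeHttpsRootDomain /t REG_DWORD /d 1 /f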


  • Page Spamming via locations

    - by codemonkey
    Hi guys, I am new here so please be gentle :) I have created a web page for a small mail order business. The page asks the reader if they are in need of a supplier for products in their "area" and if they have ever been let down by a supplier in that "area", etc. It also lists all the local villages and hamlets around the [area] which they can also supply to. This page is dynamically created: the [area] changes and so do the small towns that are local to that town. The page also contains information on the products, so the word count vs town names is not stupid. An example of one of the URLs would be www.website.com/1014/Halesowen/

    It basically covers the whole of the UK, so around 800 main towns with 28,000 local villages. The URL changes, so do the title and h1 tags, and each page is geo-coded for that town. My question really is: is this a good or bad idea? Is it a black hat technique? I have been told that if I have to ask the question then it probably is, but the site does supply to all these areas just as any mail order company does, and I would like to get listed higher in each town for the products. I have seen this done on a few sites, but only with a few targeted towns and not the whole of the UK, so I would be really interested in your thoughts on this.

    I would post the URL to the site, but as I am new here I am a bit unsure of the rules regarding posting links. The whole site needs a lot of other on-site SEO work doing and I will be doing that over the next few weeks. I look forward to your views on this. P.S. If I am allowed to post the URL without getting into trouble so you can see it, someone let me know? Thanks in advance


  • Bazaar - pull the last revision only (and not the whole branch)

    - by Sandman4
    In short: how can I take the latest revision (only) from a remote Bazaar repository and add it as a new revision to a local repository?

    Background: I have a development system and a production system. On the development system there's a Bazaar repository with a branch holding lots of development revisions. Once in a while I want to incorporate the latest developments into the production system. I want to do so by some sort of "pulling" (the development system cannot connect to production for security reasons, but production can initiate a connection to development). On production, I don't want the whole development revision history, only those revisions which actually go into production (normally the branch tip). Yet I want version control on the production system to keep track of what actually goes into production each time.

    What I have tried:
    - bzr pull pulls the whole branch.
    - bzr pull --revision=last:1 also pulls the whole branch, up to the specified revision.
    - bzr merge --pull --revision=last:1 also pulls the whole branch.
    - bzr merge --pull --revision=last:2..last:1 and bzr merge --pull --change=last:1 both pull only the new changes introduced in the latest revision, but not changes introduced in the older revisions.
    - With a lightweight checkout I have no record of which revisions are pulled into production; the local working tree remains part of the remote repository.

    The only way I see so far is importing the working tree using rsync or scp and committing it to a local branch afterwards (see the sketch below). Any better ideas?
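    A minimal sketch of that rsync-and-commit fallback, assuming the production copy at /srv/prod is already an ordinary bzr branch; the host name and paths are placeholders, not taken from the question.

        # run on the production system
        rsync -a --delete --exclude='.bzr' devhost:/home/dev/project/ /srv/prod/
        cd /srv/prod
        bzr add .                                    # register any new files
        bzr commit -m "Import development tip $(date +%F)"

    This gives production its own linear history of exactly what was deployed, at the cost of losing any link to the development branch's revision IDs.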


  • mounts aren't case-sensitive

    - by Asi
    I mounted a few drives from Linux boxes on my network, but those mounts aren't case-sensitive. The mount command I used (according to man mount.cifs, case-sensitive should be the default):

        mount //10.0.1.10/remote_folder /local_folder -t cifs -o username=xxxx,password=xxxx

    but those mounts aren't case-sensitive. For example:

        ls -l /local_folder/testfile.txt
        ls -l /local_folder/TESTFILE.TXT

    give the same result instead of 'file not found' for the second one. A couple of important points:
    - All drives are running on Linux machines.
    - My local machine is running Fedora 18 and it is case-sensitive for ANY folder/file except the mounted drives.
    - All drives/mounts are case-sensitive when using SSH. So if I SSH from my local machine to a remote machine, ls -l /local_folder/TESTFILE.TXT says 'file not found', as it should.

    So I believe the issue is on my local machine and not in the way I did the mount, but I'm not sure where to look next (I'm new to Linux).
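    One thing worth ruling out first (an assumption, not a confirmed diagnosis): with CIFS the server side also has a say in case handling. If the remote Linux boxes export those folders through Samba, the share definition in /etc/samba/smb.conf on the server controls it, roughly like this (share name and path are placeholders):

        [remote_folder]
            path = /path/to/remote_folder
            case sensitive = yes
            preserve case = yes

    After changing that and reloading Samba, remount the share on the Fedora side and repeat the two ls commands to see whether the behaviour changes.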


  • Automatically mount a remote folder on boot

    - by Andrew
    I'm trying to mount a Windows folder on my Ubuntu machine on start up. I've tried following this page here, modifying /etc/fstab and appending

        sshfs#my_user@remote_host:/path/to/directory <local_mount_point> fuse user 0 0

    to it, but it fails; on start up, I get an error saying that the mounting failed, and I can press S to skip or M to recover manually. I also tried following this page here, appending

        /usr/bin/sshfs -o idmap=user my_user@remote_host:/path/to/directory <local_mount_point>

    to the /etc/rc.local file, but this doesn't help either; Ubuntu just boots up normally without mounting. I have Cygwin installed on my Windows machine, and I can run everything smoothly, such as sshing without passwords and mounting manually. I've also tried running the modified rc.local file with $ /etc/rc.local, and it works perfectly, but I just can't seem to get the folder mounted on start up. Can someone help me?
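    One commonly suggested variant of that fstab line, offered as a sketch under assumptions rather than a verified fix: at boot the mount runs as root before the user's SSH agent exists and sometimes before the network is fully up, so naming the key file explicitly and marking the entry as a network filesystem can make the difference. The key path and mount point below are placeholders.

        # /etc/fstab
        sshfs#my_user@remote_host:/path/to/directory  /mnt/remote  fuse  _netdev,delay_connect,IdentityFile=/home/my_user/.ssh/id_rsa,allow_other,idmap=user  0  0

    (allow_other additionally requires user_allow_other to be enabled in /etc/fuse.conf.)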


  • Problem about IP and computer name in Ubuntu

    - by bugbug
    I can't connect to the MySQL database because it always changes 192.168.1.101 to ubuntu.local:

        $ mysql -uroot -padmin1234 -h192.168.1.101
        ERROR 1045 (28000): Access denied for user 'root'@'ubuntu.local' (using password: YES)

    How do I solve this problem? File /etc/hosts on this machine:

        127.0.0.1 localhost
        127.0.1.1 ubuntu.ubuntu-domain ubuntu

        # The following lines are desirable for IPv6 capable hosts
        ::1     localhost ip6-localhost ip6-loopback
        fe00::0 ip6-localnet
        ff00::0 ip6-mcastprefix
        ff02::1 ip6-allnodes
        ff02::2 ip6-allrouters
        ff02::3 ip6-allhosts

    I have no idea about 'root'@'ubuntu.local' or where it comes from.
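    What the error suggests is happening: the MySQL server resolves the connecting client's IP back to a hostname, so the login is checked against 'root'@'ubuntu.local' rather than an IP-based account. Two hedged ways out, with the password taken from the question and everything else an assumption about the setup:

        # 1) add a grant that matches the resolved hostname
        #    (run on the server, connected locally as root)
        mysql -uroot -p -e "GRANT ALL PRIVILEGES ON *.* TO 'root'@'ubuntu.local' IDENTIFIED BY 'admin1234'; FLUSH PRIVILEGES;"

        # 2) or stop MySQL from resolving names at all, so grants match by IP:
        #    add skip-name-resolve under [mysqld] in /etc/mysql/my.cnf and
        #    restart the MySQL server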


  • Why do I get "permission denied" errors with Python easy_install?

    - by ATMathew
    I'm an Ubuntu newbie and have been trying to install Python's easy_install so that I don't have to deal with source files when installing Python libraries. I've run the following, and it seems to install the correct applications:

        sudo apt-get install python-setuptools

    However, when I run easy_install sqlalchemy or easy_install pysqlite3, it doesn't work. I get the following error message:

        install_dir /usr/local/lib/python2.6/dist-packages/
        error: can't create or remove files in install directory

        The following error occurred while trying to add or remove files in the
        installation directory:

            [Errno 13] Permission denied: '/usr/local/lib/python2.6/dist-packages/test-easy-install-1674.pth'

        The installation directory you specified (via --install-dir, --prefix, or
        the distutils default setting) was:

            /usr/local/lib/python2.6/dist-packages/

        Perhaps your account does not have write access to this directory? If the
        installation directory is a system-owned directory, you may need to sign in
        as the administrator or "root" account. If you do not have administrative
        access to this machine, you may wish to choose a different installation
        directory, preferably one that is listed in your PYTHONPATH environment
        variable.

        For information on other options, you may wish to consult the documentation at:

            http://packages.python.org/distribute/easy_install.html

        Please make the appropriate changes for your system and try again.

    Help! Abraham
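    The message is the stock setuptools complaint about writing to a system directory as an ordinary user. Two straightforward ways to deal with it, with the package names taken from the question:

        # install system-wide, with root privileges, the same way apt-get was run
        sudo easy_install sqlalchemy

        # or keep it per-user, no root needed (installs under ~/.local)
        easy_install --user sqlalchemy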


  • Setting up thttpd to run vqadmin or qmailadmin...keep getting 404s

    - by Ian
    I run nginx for my web server but wanted to quickly toss up thttpd so I could do some maintenance using either vqadmin or qmailadmin. Those files are located at /usr/local/apache/cgi-bin/qmailadmin and /usr/local/apache/cgi-bin/vqadmin/vqadmin.cgi. My /etc/thttpd.conf is:

        host=127.0.0.1
        port=8000
        user=apache
        logfile=/var/log/thttpd.log
        pidfile=/var/run/thttpd.pid
        dir=/usr/local/apache/cgi-bin
        nochroot
        cgipat=**.cgi

    When I use lynx to go to http://127.0.0.1:8000/cgi-bin/vqadmin/vqadmin.cgi, thttpd tosses a 404. Any idea how to get this working? Many thanks.
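    A hedged observation rather than a confirmed fix: with dir=/usr/local/apache/cgi-bin, that directory is already thttpd's document root, so a URL that repeats /cgi-bin points at a path that does not exist underneath it. Two things worth trying:

        # request the CGI relative to the configured document root
        lynx http://127.0.0.1:8000/vqadmin/vqadmin.cgi

        # or keep the /cgi-bin/ URLs by moving the root up one level in
        # /etc/thttpd.conf and widening the CGI pattern to match (untested sketch):
        #   dir=/usr/local/apache
        #   cgipat=/cgi-bin/**.cgi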


  • Significant OS X Finder lag when listing directories/files

    - by Jack Sleight
    I'm experiencing some significant OS X Finder lag that seems to be purely an issue with Finder itself, and not the HD or any other part of OS X (I'll explain below). The lag only appears when listing directories/files, where I'm seeing up to twelve or so seconds of delay (the folder opens with a blank list and the spinner going in the bottom right).
    - This happens with both the local SSD and network drives (connected via ethernet or wifi).
    - Browsing both local and network drives in Terminal and listing directories is instant.
    - I can actually browse files on my NAS from my phone over a 3G connection from the other side of the country faster than Finder can when connected to the local network (madness!)

    Can anyone help? Thanks.


  • Git and Amazon EC2 public key denied

    - by MrNart
    I had git working before in /var/html/projectfolder, then realized it was a security risk, so I made a new folder /projects off the root folder and tried to replicate what I did, and now it doesn't work. Here is the backlog of what I did on my local machine and on EC2.

    Server (EC2):
    1. I added my public key to the authorized_user file in the ~/.ssh folder.
    2. Created a bare repository with git init --bare.
    3. Changed folder permissions with sudo chgrp -R ec2-user * and sudo chmod -R g+ws *.

    Local machine:
    1. Created a local repository with git init.
    2. touch, add, commit a readme file.
    3. Pointed origin master to EC2 via git remote add origin ssh://ec2-user@remote-ip/path/to/folder.

    This is my output:

        Permission Denied (publickey)
        fatal: The remote end hung up unexpectedly
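    A quick way to separate an SSH key problem from a git problem, sketched under the assumption that nothing else about the instance changed:

        # from the local machine: this should log you in without a password,
        # and -v shows which keys are offered and why they are rejected
        ssh -v ec2-user@remote-ip

        # on the EC2 instance: OpenSSH reads ~/.ssh/authorized_keys by default
        # (note the name; a file called authorized_user is not consulted), and
        # it ignores the file if the permissions are too open
        chmod 700 ~/.ssh
        chmod 600 ~/.ssh/authorized_keys

    Only once plain ssh works is it worth looking at the repository path and group permissions again.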


  • Static IP Address on Ubuntu 12.04 Virtual Machine

    - by chrisnankervis
    I've set up a VM running Ubuntu 12.04 specifically for local web development and am having some problems ensuring it has a static IP address. A static IP address is important as I'm using the IP address in my hosts file to assign a .local suffix to addresses used both in the browser and to connect to the correct database on the VM. Currently, every time I connect to a new network or my VM is assigned a new IP address, I need to reconfigure my whole environment, which is becoming quite a pain. It also probably doesn't help that the default-lease-time on the Ubuntu VM is set to 1800 by default.

    At the moment I'm using VMware Fusion and the network adapter is enabled and set to "Autodetect" under Bridged Networking. I've tried to set a static IP address within dhcpd.conf using the code below:

        host ubuntu {
            hardware ethernet 00:50:56:35:0f:f1;
            fixed-address: 192.168.100.100;
        }

    The fixed-address that I've used is also outside the range specified in the subnet block (which in this case is 192.168.100.128 to 192.168.100.254). I've tried adding and removing the network adapter and restarting my Mac after each time, to no avail. Below is an ifconfig of the VM that might be of some help:

        eth0      Link encap:Ethernet  HWaddr 00:50:56:35:0f:f1
                  inet addr:192.168.0.25  Bcast:192.168.0.255  Mask:255.255.255.0
                  inet6 addr: fe80::250:56ff:fe35:ff1/64 Scope:Link
                  UP BROADCAST RUNNING MULTICAST  MTU:1500  Metric:1
                  RX packets:1624 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:416 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:1000
                  RX bytes:147348 (147.3 KB)  TX bytes:41756 (41.7 KB)

        lo        Link encap:Local Loopback
                  inet addr:127.0.0.1  Mask:255.0.0.0
                  inet6 addr: ::1/128 Scope:Host
                  UP LOOPBACK RUNNING  MTU:16436  Metric:1
                  RX packets:0 errors:0 dropped:0 overruns:0 frame:0
                  TX packets:0 errors:0 dropped:0 overruns:0 carrier:0
                  collisions:0 txqueuelen:0
                  RX bytes:0 (0.0 B)  TX bytes:0 (0.0 B)

    Are there any specific issues with 12.04 that I'm missing? Otherwise, has anyone else got any ideas? Thanks in advance.
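    A hedged alternative that sidesteps the host's DHCP configuration entirely: since the adapter is bridged, the guest can simply be given a static address inside Ubuntu. The addresses below are assumptions that would need to match whichever LAN the Mac is bridged onto; note the ifconfig above shows the VM currently on 192.168.0.x, not 192.168.100.x.

        # File: /etc/network/interfaces (inside the 12.04 guest)
        auto eth0
        iface eth0 inet static
            address 192.168.0.25
            netmask 255.255.255.0
            gateway 192.168.0.1
            dns-nameservers 192.168.0.1

    This keeps the address fixed only while the VM stays bridged onto that particular network; on a different LAN the environment would still need adjusting.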


  • Use Port Binding Permissions on Windows

    - by Sharon
    This should be an easy one, but I can't find anything on it. I want to use IIS Express with my local user account to bind to a port on my netbios name. For example, http://computername:1315. My local user account doesn't have permission to do this, but I have administrator access on the machine. Anyone know how to grant permission to my local user account to bind to a port with my computer name instead of localhost? This is on Windows 7.
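    The usual mechanism for this on Windows 7 is an HTTP.SYS URL reservation, added once from an elevated command prompt; the account name below is a placeholder for the actual local user, and the URL is taken from the question.

        rem reserve the URL so the non-admin account may listen on it
        netsh http add urlacl url=http://computername:1315/ user=COMPUTERNAME\LocalUser

        rem to verify or undo later:
        netsh http show urlacl
        netsh http delete urlacl url=http://computername:1315/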


  • Scaling along an arbitrary axis (Dealing with non-uniform scale)

    - by Jon
    I'm trying to build my own little engine to get more familiar with the concepts of 3D programming. I have a transform class that on each frame creates a scaling matrix (S) and a rotation matrix from a quaternion (R), and concatenates them together (S*R). Once I have SR, I insert the translation values into the bottom row. So I end up with a transformation matrix that looks like:

        [SR SR SR 0]
        [SR SR SR 0]
        [SR SR SR 0]
        [tx ty tz 1]

    This works perfectly in all cases except when rotating an object that has a non-uniform scale. For example, a unit cube with ScaleX = 4, ScaleY = 2, ScaleZ = 1 will give me a rectangular box that is 4 times as wide as the depth and twice as high as the depth. If I then translate this around, the box stays the same and looks normal. The problem happens whenever I try to rotate this scaled box. The shape itself becomes distorted, and it appears as though the scale factors are affecting the object along the world X, Y, Z axes rather than the object's local X, Y, Z axes.

    I've done some pretty extensive research through a variety of textbooks (Eberly, Moller/Hoffman, Phar etc.) and there isn't a ton there to go off of. Online, most of the answers say to avoid non-uniform scaling; I understand the desire to avoid it, but I'd still like to figure out how to support it. The only thing I can think of is that when constructing a scale matrix:

        [sx  0  0  0]
        [ 0 sy  0  0]
        [ 0  0 sz  0]
        [ 0  0  0  1]

    this scales along the world axes instead of the object's local direction, up and right vectors, or its local Z, Y, X axes. Does anyone have any tips or ideas on how to construct a transformation matrix that allows for non-uniform scaling and rotation? Thanks!
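    For reference, the "scaling along an arbitrary axis" in the title has a standard closed form that several of those texts give (this is the textbook formula, not something specific to the engine described): to scale by a factor k along a unit direction n = (nx, ny, nz), use S(n, k) = I + (k - 1) n n^T, which in the same 4x4 layout as above is:

        [1+(k-1)nx*nx    (k-1)nx*ny    (k-1)nx*nz   0]
        [  (k-1)ny*nx  1+(k-1)ny*ny    (k-1)ny*nz   0]
        [  (k-1)nz*nx    (k-1)nz*ny  1+(k-1)nz*nz   0]
        [            0             0             0  1]

    Building one of these per local axis, using the object's local right, up and forward directions as n with factors sx, sy, sz, and multiplying them together gives a scale that follows the object's own axes instead of the world axes; since the three directions are orthogonal the three matrices commute, so the order does not matter.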


  • High disk time on sql-server

    - by Patrik
    Hi. We have a dedicated SQL Server 2008 R2 Enterprise Edition machine. The setup is:
    - D: (data files) - stored on local SSD disks (not the same disks as the log files) (RAID 10)
    - E: (log files) - stored on local SSD disks (not the same disks as the data files) (RAID 1)
    - F: (transaction log backups) - stored remotely on a SAN

    Today we moved our log files to the new disks (from F: to E:), that is, from a shared volume (F:, on the SAN) to dedicated local disks (E:). What happened then was that "disk time", "avg. transfer time" and "avg. disk write queue length" increased on the volume where we have the data files (D:), not on the volume where the log files are located. The data volume and log volume do not share disks, but they do share the same controller card. "Disk idle time" is low for all volumes. One thought is of course that the controller card might be overloaded, but we need more ideas on where the problem might be.
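    One way to narrow it down from inside SQL Server (a hedged diagnostic sketch, not a fix): the per-file I/O statistics DMV shows whether the stalls really land on the D: data files or are spread across the volumes, independently of the perfmon counters.

        -- average read/write latency per database file since the last restart
        SELECT DB_NAME(vfs.database_id) AS database_name,
               mf.physical_name,
               vfs.io_stall_read_ms  / NULLIF(vfs.num_of_reads, 0)  AS avg_read_ms,
               vfs.io_stall_write_ms / NULLIF(vfs.num_of_writes, 0) AS avg_write_ms
        FROM sys.dm_io_virtual_file_stats(NULL, NULL) AS vfs
        JOIN sys.master_files AS mf
          ON mf.database_id = vfs.database_id AND mf.file_id = vfs.file_id
        ORDER BY avg_write_ms DESC;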


  • How should I determine if a user is logged in graphically while lightdm is running?

    - by Jack
    I want to know if someone is logged into a local X session. In the past I looked at the output of ck-list-sessions, which looked something like this:

        Session12:
            unix-user = '[redacted]'
            realname = '[redacted]'
            seat = 'Seat1'
            session-type = ''
            active = TRUE
            x11-display = ':0'
            x11-display-device = '/dev/tty8'
            display-device = ''
            remote-host-name = ''
            is-local = TRUE
            on-since = '2012-10-22T18:17:55.553236Z'
            login-session-id = '4294967295'

    If no one was logged in, there was no output. I checked if someone was logged in with

        ck_result" string => execresult("/usr/bin/ck-list-sessions | /bin/grep x11 | /usr/bin/cut --delimiter=\\' -f 2 | /usr/bin/wc -w

    This no longer works, because the lightdm greeter looks like a logged-in user:

        Session12:
            unix-user = '[redacted]'
            realname = 'Light Display Manager'
            seat = 'Seat1'
            session-type = 'LoginWindow'
            active = TRUE
            x11-display = ':0'
            x11-display-device = '/dev/tty8'
            display-device = ''
            remote-host-name = ''
            is-local = TRUE
            on-since = '2012-10-22T22:17:55.553236Z'
            login-session-id = '4294967295'

    I guess I could check session-type, but I don't know how to do that and check x11-display in a one-liner. I would then need to write my own script, but before doing that I thought I would check whether anyone else has already done the work, whether there is a way to get ConsoleKit to tell me what I want, or whether I should be using a different tool.
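    A hedged sketch of such a one-liner, assuming the greeter is the only session whose session-type is 'LoginWindow' and that the attribute lines keep the " = " layout shown above; it prints the number of real graphical sessions (0 when only the greeter is up):

        ck-list-sessions | awk -F" = " '
            /^Session/        { if (ok) n++; ok = 0; greeter = 0 }
            /session-type = / { if ($2 ~ /LoginWindow/) greeter = 1 }
            /x11-display = /  { if (length($2) > 2 && !greeter) ok = 1 }
            END               { if (ok) n++; print n + 0 }'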


  • Is it possible to keep nm-applet running between invocations of WM startup?

    - by serverninja
    I am using nm-applet to interface with NetworkManager, running xmonad as a window manager. My X sessions (including nm-applet) are set up with a /usr/local/bin/xmonad.start script. My question is: how can I keep nm-applet running in the background as long as X is running, but not necessarily xmonad? As mentioned above, it is being started with xmonad (and dying with it when xmonad is restarted, etc.). I am using gdm to manage my X sessions, and I'm running 10.10. Where's a good place to start nm-applet to suit my particular needs? I need to remove it from the control of xmonad, but don't know where to start it otherwise. Any help, tips, etc. appreciated.

    Edit: the problem seems to be with how I have integrated xmonad. I have the session script as a file in /usr/share/xsessions/xmonad.desktop with the following contents:

        [Desktop Entry]
        Encoding=UTF-8
        Name=XMonad
        Comment=Lightweight tiling window manager
        Exec=/usr/local/bin/xmonad.start
        Icon=xmonad.png
        Type=XSession

    /usr/local/bin/xmonad.start contains the following:

        #!/bin/bash
        xrdb -merge ~/.Xresources
        xcompmgr -c &
        trayer --edge top --align right --SetDockType true --SetPartialStrut true --expand true --width 8 --heighttype pixel --height 18 --transparent true --alpha 0 --tint 0x000000 &
        gnome-settings-daemon &
        gnome-screensaver &
        if [ -x /usr/bin/nm-applet ] ; then
            nm-applet --sm-disable &
        fi
        /usr/bin/urxvtd -q -o -f &
        eval `ssh-agent` &
        if [ -x /usr/bin/gnome-power-manager ] ; then
            sleep 1
            gnome-power-manager &
        fi
        /usr/bin/gnome-volume-control-applet &
        exec xmonad

    The question is how do I integrate xmonad, gdm, X, etc. in such a manner as to replicate the behaviour I currently have, except with nm-applet (and possibly other programs) running whether or not xmonad is?
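    One commonly suggested arrangement, offered as a sketch under the assumption that gdm's Xsession script sources ~/.xprofile on Ubuntu 10.10: move the applet launch out of xmonad.start and into ~/.xprofile, which belongs to the X session rather than to the window manager, so it no longer depends on how xmonad is started or restarted.

        # ~/.xprofile -- started once per X session, independent of the window manager
        if [ -x /usr/bin/nm-applet ] ; then
            nm-applet --sm-disable &
        fi

    The corresponding block would then be dropped from xmonad.start; the same treatment applies to any other applet that should outlive the window manager.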


  • Media player only works as administrator?

    - by Jeremy
    It seems I can only get Media Player 12 to work as administrator. If I run it normally (I am in the administrator group on my local PC), right-click on Music, and choose Manage Music Library, Media Player will sit and think for 5 or so seconds, then just not do anything; no dialog, no error. If I run as administrator, I can get into the Manage Music Library dialog and add a public folder containing my music. I've even tried granting Everyone access to the public folder. One thing to note is that I have recently set up a domain controller and added my PC to the domain. With my local account I never noticed this problem, but I've since created a domain account and am now seeing this issue. I can't find much difference between the local and domain accounts; both are in the administrator group. Why would WMP require run as administrator? OS: Windows 7 64-bit.


  • Windows Server vs Sharing

    - by Mark Lawrence
    Not sure if this is the right place for this question. I have a friend who owns a small business running 3 local machines that are connected to the internet, plus a server that each local machine connects to. He has recently bought a newer server, and by server I mean a Windows Vista box. He wants to use this purely to store data that the 3 local machines can access, kind of like a glorified external hard drive. I'm suggesting he forgo the server option and simply set up sharing on the 'server' box. Interested in hearing any suggestions as to whether the server idea is preferable.


  • What disk setup is needed / best practice for hypervisor-only servers?

    - by Luke404
    Planning to buy some servers to run a hypervisor (Citrix XenServer or VMware vSphere, still have to decide between the two), we'd like to boot off the local redundant SD card module offered by various vendors (e.g. Dell, HP, etc.). The actual VMs will run from an existing iSCSI SAN (which, by the way, can't support booting the servers directly off the SAN). What are the reasons, if any, to choose completely diskless servers vs. having some local storage? And what would be the guidelines for choosing that local storage (number of spindles, RAID level, etc.)?


  • Precise: Evolution laggy due to IMAP -profile or due to some odd Sync -issue?

    - by Izzy
    I'm fighting with Evolution. Basically it's working fine, but it is very slow to react in certain situations. There is apparently some problem with syncing and IMAP.

    Helper questions: Could the move away from Bonobo have something to do with the slow-down? There might be some trouble with the new engine and "asynchronous actions". What to do about it? I want to get the previous "working mood" back. How can I speed this thing up?

    Different scenarios:
    - When sending a mail, the composer window hangs there inactive for a couple of seconds, everything greyed out. Though there is a green check mark saying it's sent, I'm not sure a) why it's still blocking everything and b) whether I could simply close it without "breaking"/"losing" anything. In earlier versions, the composer window closed pretty fast, and one could see the message being stored in the local "outbox" until it was sent, and one could immediately continue with the next task. I prefer that behaviour over the current one.
    - Switching between modules: coming from mail and switching to the address book takes a couple of seconds. Same for switching to the calendar.

    I read about different "possible causes" and tried a few things:
    - I only have 3 local address books, so no networking should be involved here. To make sure, I switched to offline mode and then tried to access the address book. No noticeable difference.
    - I use 3 Google Calendars. Switching to offline mode made a minor difference, but so minor that it could also be "imagination", since one might have expected this in that case.
    - According to some reports, disabling the tasks should help. Well, it didn't in my case, as I don't use them regularly (just two local items stored here).


  • What are some efficient ways to set up my environment when working on a remote site?

    - by Prefix
    Hello fellow programmers. I am still a relatively new programmer and have recently gotten my first on-campus programming position. I am the sole dev responsible for 8 domains as well as 3 small PHP web apps. The campus has its web environment divided into staging and live servers; we develop on staging via SFTP and then push the updates to the live server through a web GUI. I use Sublime Text 2 and the Sublime SFTP plugin for all my dev work (it's my preferred editor). If I am just making an edit to a page, I'll open that individual file via the FTP browser. If I am working on the PHP web app projects, I have the app directory mapped to a local folder so that when I save locally, the file is auto-uploaded through Sublime SFTP.

    I feel like this workflow is slow and sub-optimal. How can I improve my workflow for working with remote content? I'd love to set up a local environment on my machine, as that would eliminate the constant SFTP upload/download, but as I said there are many sites, and a local copy of the entire domain would be quite large and complex; not to mention that keeping it updated with whatever the latest is on the staging server would be a nightmare. Does anyone know how I can improve my general web dev workflow from what I've described? I'd really like to cut out constantly editing over FTP, but I'm not sure where to start other than ripping the entire directory and dumping it into XAMPP.


  • Outlook Anywhere remote https connection issue

    - by holian
    We have SBS 2003, and we use DynDNS. We forward the DynDNS address's port 443 to the local server's port 443: mycompany.dyndns.org:443 -> server.mycompany.local:443. On an Android phone I can check my mail with Exchange ActiveSync, and from a remote machine I can check my mail in OWA (https://mycompany.dyndns.org/exchange). But I can't set up Outlook 2013 to connect remotely. I installed the server.mycompany.local certificate into the remote machine's trusted certificate container, but I got this error message: "There is a problem with the proxy server's security certificate. The name on the security certificate is invalid or does not match the name of the target site. Outlook is unable to connect to the proxy server. (Error Code 10)". Is it possible to connect to Exchange via DynDNS? What's the problem? Thank you.
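    The error text points at a name mismatch: the published certificate carries server.mycompany.local, while Outlook connects to mycompany.dyndns.org. A quick hedged way to confirm which name the server actually presents, from any machine with OpenSSL available:

        openssl s_client -connect mycompany.dyndns.org:443 </dev/null 2>/dev/null \
            | openssl x509 -noout -subject -dates

    If the subject really is the .local name, the usual routes are reissuing the certificate for the public DynDNS name or adding it as a subject alternative name; how practical each of those is on SBS 2003 is beyond what the question shows.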


  • Xdebug 2.0.5 with Zend Server CE PHP5.2.12 possible?

    - by notbrain
    I'm using Zend Server CE with PHP 5.2.12 on OS X Snow Leopard and want to use Xdebug. I've turned off Zend Data Cache, Zend Optimizer+, and Zend Debugger in the console. When I run

        $ cd ~/Downloads/xdebug-2.0.5
        $ /usr/local/zend/bin/phpize

    I get

        Configuring for:
        PHP Api Version:         20041225
        Zend Module Api No:      20060613
        Zend Extension Api No:   220060519

    The PHP API version, 20041225, seems to be off from the documentation (aka wrong). When I continue the installation with

        $ ./configure ---with-php-config=/usr/local/zend/bin/php-config
        $ make
        $ sudo make install

    the installed xdebug.so seems to be the wrong one. Which version of Xdebug do I need for this PHP API version? The Zend API numbers are OK; I'm just confused that the PHP API version doesn't match.

        PHP Warning:  PHP Startup: Unable to load dynamic library '/usr/local/zend/lib/php_extensions/xdebug.so' - (null) in Unknown on line 0
        PHP 5.2.12 (cli) (built: Feb 17 2010 13:39:36)
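    A hedged sanity check before chasing Xdebug versions: compare the API numbers phpize reported with the ones the PHP binary that will actually load xdebug.so reports, and confirm which extension directory that binary expects, to make sure the module is being built for and copied into the same PHP installation.

        /usr/local/zend/bin/php -i | grep -i "api"
        /usr/local/zend/bin/php -i | grep -i "extension_dir"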


  • reverse proxy http to tomcat

    - by John Q
    I've configured an Apache server with SSL and a reverse proxy to a Tomcat:

        <VirtualHost domain.com:1443>
            [...]
            ProxyRequests Off
            ProxyPreserveHost On
            ProxyPass / http://local.com:8080/
            ProxyPassReverse / http://local.com:8080
            SSLEngine on
            [...]
        </VirtualHost>

    Tomcat is listening on 8080. The issue is that the app on Tomcat is redirecting the request (HTTP 302 Moved Temporarily). For example, if I use the URL https://domain.com:1443/folder, the reverse proxy launches the request http://local.com:8080/folder, then the app redirects to "/subfolder", so the final request is http://domain.com:1443/folder/subfolder. The result is a 400 Bad Request error code, as the request is HTTP on my SSL port. Do you know how I can fix this issue? Thanks in advance.
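    One way this is commonly handled (a hedged sketch, with the hostname and ports lifted from the question but the rest an assumption about the setup): tell Tomcat it sits behind an SSL proxy, so the redirects it builds use https and the public port instead of the scheme and port the proxied request arrived on. In Tomcat's server.xml:

        <!-- HTTP connector fronted by the Apache SSL virtual host -->
        <Connector port="8080" protocol="HTTP/1.1"
                   proxyName="domain.com" proxyPort="1443"
                   scheme="https" secure="true" />

    After a Tomcat restart, the 302 Location headers should come back as https://domain.com:1443/..., assuming the application builds them from the request rather than from hard-coded URLs.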

