Search Results

Search found 21072 results on 843 pages for 'thin client'.

Page 302/843 | < Previous Page | 298 299 300 301 302 303 304 305 306 307 308 309  | Next Page >

  • HTTP Live Streaming Broadcast

    - by user761389
    I'm designing an app for streaming video from a device (e.g. an iPhone) via a server to one or more devices, and I have been researching Apple's HTTP Live Streaming protocol. One thing that isn't clear is whether it is possible to stream live video (with audio) to the server and then have it streamed simultaneously, in real time, to the client devices. From reading the documentation and technical notes from Apple, it seems the index file needs to be created before the segmented video files can be served to a client. Is this right? If so, HTTP Live Streaming may not be suitable in this case; what other technologies or software should I consider? Thanks
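
    For reference, a live HLS index file is a sliding-window playlist that the server keeps rewriting as new segments are produced; clients poll it repeatedly, and the absence of an EXT-X-ENDLIST tag is what marks the stream as still live. A minimal sketch of such a playlist (segment names, durations and sequence numbers are purely illustrative):

        #EXTM3U
        #EXT-X-VERSION:3
        #EXT-X-TARGETDURATION:10
        #EXT-X-MEDIA-SEQUENCE:2680
        #EXTINF:10.0,
        fileSequence2680.ts
        #EXTINF:10.0,
        fileSequence2681.ts
        #EXTINF:10.0,
        fileSequence2682.ts

    The protocol itself does not require the whole index to exist up front; whether the segmenter can produce and publish updates quickly enough end-to-end is the real latency question.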

    Read the article

  • Using iTunes within Terminal Services 2008 R2 - Pitfalls etc

    - by Kristiaan
    I was hoping to get some further information on the possible dos and don'ts of installing, using and maintaining iTunes within a Terminal Server environment. We have come across a situation in our company whereby some of our users who are using thin clients now need the ability to sync, update and manage their devices; previously they used either standard desktop systems or laptops, so there was no issue with running iTunes. I have not found much information on the web about using iTunes within a Terminal Server farm. I'd like to find out whether iTunes works in this environment, and about any known or common issues that occur when running it like this.

    Read the article

  • SSH asks for password

    - by user1435470
    I have already: installed the server; generated the pub/private keys with -P ""; copied id_rsa.pub to authorized_keys; run ssh localhost and answered "yes" so the host was added to known_hosts; tried ssh localhost again - it still asks for a password. Output:

        hduser@hduser1-desktop:~$ ssh -v localhost
        OpenSSH_5.3p1 Debian-3ubuntu7, OpenSSL 0.9.8k 25 Mar 2009
        debug1: Reading configuration data /etc/ssh/ssh_config
        debug1: Applying options for *
        debug1: Connecting to localhost [127.0.0.1] port 22.
        debug1: Connection established.
        debug1: identity file /home/hduser/.ssh/identity type -1
        debug1: identity file /home/hduser/.ssh/id_rsa type 1
        debug1: Checking blacklist file /usr/share/ssh/blacklist.RSA-2048
        debug1: Checking blacklist file /etc/ssh/blacklist.RSA-2048
        debug1: identity file /home/hduser/.ssh/id_dsa type -1
        debug1: Remote protocol version 2.0, remote software version OpenSSH_5.3p1 Debian-3ubuntu7
        debug1: match: OpenSSH_5.3p1 Debian-3ubuntu7 pat OpenSSH*
        debug1: Enabling compatibility mode for protocol 2.0
        debug1: Local version string SSH-2.0-OpenSSH_5.3p1 Debian-3ubuntu7
        debug1: SSH2_MSG_KEXINIT sent
        debug1: SSH2_MSG_KEXINIT received
        debug1: kex: server->client aes128-ctr hmac-md5 none
        debug1: kex: client->server aes128-ctr hmac-md5 none
        debug1: SSH2_MSG_KEX_DH_GEX_REQUEST(1024<1024<8192) sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_GROUP
        debug1: SSH2_MSG_KEX_DH_GEX_INIT sent
        debug1: expecting SSH2_MSG_KEX_DH_GEX_REPLY
        debug1: Host 'localhost' is known and matches the RSA host key.
        debug1: Found key in /home/hduser/.ssh/known_hosts:3
        debug1: ssh_rsa_verify: signature correct
        debug1: SSH2_MSG_NEWKEYS sent
        debug1: expecting SSH2_MSG_NEWKEYS
        debug1: SSH2_MSG_NEWKEYS received
        debug1: SSH2_MSG_SERVICE_REQUEST sent
        debug1: SSH2_MSG_SERVICE_ACCEPT received
        debug1: Authentications that can continue: publickey,password
        debug1: Next authentication method: publickey
        debug1: Offering public key: /home/hduser/.ssh/id_rsa
        debug1: Authentications that can continue: publickey,password
        debug1: Trying private key: /home/hduser/.ssh/identity
        debug1: Trying private key: /home/hduser/.ssh/id_dsa
        debug1: Next authentication method: password

    Any suggestions? Cheers
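
    A common cause when key authentication is set up this way is directory permissions: sshd silently skips authorized_keys if the home directory or ~/.ssh is writable by others. A quick check worth running before anything else (a sketch, assuming a default Ubuntu sshd configuration):

        chmod go-w ~                       # home directory must not be group/world-writable
        chmod 700 ~/.ssh
        chmod 600 ~/.ssh/authorized_keys
        # then retry while watching the server-side log for why it falls back to password:
        tail -f /var/log/auth.log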

    Read the article

  • Windows Vista Home memory usage problem

    - by lordg
    Hi, I have a Windows Vista Home laptop from a client that is running on 1 GB of RAM. The laptop is used for very basic things: Word, internet, Outlook, etc. What makes zero sense is that the RAM is being completely consumed, sometimes causing the PC to hang when it can't take any more. However, in Task Manager the processes appear to be consuming only maybe 100 MB (Private Working Set). The client literally has a simple setup and is running Kaspersky, though that does not appear to be the cause of the excessive memory usage. Does anyone have a suggestion on how to resolve the memory issue, or how to track down what is actually happening and fix it? G

    Read the article

  • Depending on a fixed version of a library and ignoring its updates

    - by Moataz Elmasry
    I was talking to a technical boss yesterday. It's about a project in C++ that depends on OpenCV; he wanted to include a specific OpenCV version in the SVN and keep using that version, ignoring any updates, which I disagreed with. We had a heated discussion about it. His arguments: Everything has to be delivered in one package and we can't ask the client to install external libraries. We depend on a fixed version so that new updates of OpenCV won't break our code. We can't guarantee that within a version update, e.g. from 3.2.buildx to 3.2.buildy, the function signatures won't change. My arguments: True, everything has to be delivered to the client as one package, but that's what build scripts are for - they download the external libraries and create a bundle (a rough sketch follows below). Within updates of the same version, 3.2.buildx to 3.2.buildy, it's practically impossible for a signature to change unless it is a really badly run framework, which isn't the case with OpenCV. We deprive ourselves of new updates and features of that library. If there's a bug in the version we took, even if there's a bug fix later we won't be able to get it. It's simply inefficient and poor design to depend on a fixed version/build of an external library, as it makes it difficult for our project to adapt to new changes in the future. So I'd like to know what you guys think. Does it really make sense to include a specific version of an external library in our SVN and keep using it, ignoring all updates?
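
    For illustration, a build script of the kind mentioned above might look roughly like this (a sketch only; the download URL, version number and install prefix are placeholders, not a recommendation of a specific OpenCV release):

        #!/bin/sh
        # fetch a known-good minimum version of the dependency, build it,
        # and stage it into the delivery bundle; the version is bumped in one place
        OPENCV_VERSION=3.2.0
        curl -LO "https://example.org/opencv-${OPENCV_VERSION}.tar.gz"
        tar xzf "opencv-${OPENCV_VERSION}.tar.gz"
        cd "opencv-${OPENCV_VERSION}"
        mkdir -p build && cd build
        cmake -DCMAKE_INSTALL_PREFIX="$PWD/../../bundle/opencv" ..
        make -j4 && make install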

    Read the article

  • How can I run my program on a large number of computers? [closed]

    - by zenpoy
    I'm looking for a (preferably free) service for running an executable I wrote. It's not malicious, it's not a virus, it's not a scam, and if this is really important I can upload the Python source code instead. I wrote a small crawler to gather information regarding the style of web pages for my MA project, and I need a lot more data. EDIT: Here is more information on my problem, how I'm approaching it, and where I'm stuck. As part of my research I'm trying to classify text based on its style (font-family for now). My data is based on web pages, so I wrote a client/server application - the client is a crawler that gathers this data and sends it to the server. The problem is that something like 99% of the internet is Arial, Verdana and Helvetica - other fonts are far rarer, so I need to spend a very long time gathering enough data regarding those fonts. Hope this explains it.

    Read the article

  • How to properly uninstall/reinstall Ubuntu One on Windows XP?

    - by user73303
    I had previously installed an Ubuntu One client as a test on a Windows XP machine. I then wanted to change the account for the client to a production one, but had problems changing the email address, so I decided to do a reinstall. I ran the uninstaller and downloaded ubuntuone-3.0.2-windows-installer.exe. It downloads and goes through the unpacking/install - strangely, some of the messages say "updating", as if it were replacing something that was already there. I do not get the setup/sign-in screen. There are no ubuntu% processes running. The Program Files/ubuntuone directory exists with data and dist folders. The U icon is on the desktop, pointing at ubuntuone/dist/ubuntuone-control-panel-qt.exe, but this does not run. I ran the uninstaller again, deleted the Program Files/ubuntuone directory, removed any Ubuntu entries from the registry and rebooted. I downloaded and ran the installer again - exactly the same result as above. How can I uninstall Ubuntu One so as to get a clean reinstall? Or force the install to continue after downloading/unpacking?

    Read the article

  • Different versions of 2.5" drives? Unscrewing an Intel 320 for a slimmer drive to fit newer ThinkPads?

    - by hhh
    My X60s laptops come with a 2.5" HDD that is about 1-2 mm taller than the drives in the newer X220 models. This is totally stupid when one would like to reuse the old drive in the newer laptops. I bought newer 2.5" Intel 320 SSDs and they seem to have a 1-2 mm gap that looks like it could be unscrewed, but I am unsure whether it is meant to be opened. Could someone explain what to do here? The manufacturers have started changing the good old 2.5" form factor into slightly different versions, which now means either compatibility issues to work around or buying completely new 2.5" drives. Any ideas how to proceed? Can the newer 320 drives be unscrewed, or are other versions of 2.5" drives meant for the X220? Do there exist some sort of caddies or adapters to get drives working between different computers? Perhaps booting from LapLink is currently the easiest way to get things working when moving hard drives between machines? Perhaps related: http://www.zdnet.com/blog/hardware/seagate-announced-super-slim-25in-momentus-thin-hard-drive/6439

    Read the article

  • Is there a pressure-sensitive stylus for Windows 8 capacitive-screen devices?

    - by JohnnyM
    I own a Dell XPS 12 Duo (a flip-screen ultrabook with a 10-point capacitive screen) running Windows 8. Note: the ultrabook has Bluetooth 4.0. I would really love to have a pressure-sensitive stylus that I could use to draw on the capacitive screen when in tablet mode. So far I couldn't find any that would be compatible with Windows 8; maybe you guys could help. Must have: works with a capacitive screen; Windows 8 compatible; pressure sensitive. Important: thin tip. Nice to have: palm rejection; tilt sensitivity; extra buttons (lots of them). TIA

    Read the article

  • Outlook + Exchange 2007: is it possible to get rid of local OST files?

    - by kdl
    I am looking for a solution which would let me keep the convenience of Outlook as a mail client while having no PST or OST files on the local computer. Even in non-cached mode, Outlook creates an OST file into which it downloads everything from the Exchange server. OWA does not create any local files (except cookies, I believe) but lacks some of the nice features Outlook has. Would it be feasible to place the OST files on a network share? Maybe a solution exists for some other client+server pair?

    Read the article

  • DirectX vs XNA - Which is better for me? [closed]

    - by tristo
    Recently I moved to Visual Studio 2012 from Visual Studio 2010, although I did not expect Visual Studio 2012 to be designed the way it is. Anyway, I am pleased with some of the VS 2012 technology and have moved all of my projects to it. Since getting VS 2012 I have been making Windows applications and doing other non-game work, although I have recently gotten back into the spirit of game development and I am planning to make a 3D comical game - shader effects, not too complicated meshes, but it requires a lot of lighting effects to emphasise certain parts of the game. When I was using VS 2010 I had a great time making 2D games with XNA; it uses a great language and has a very nice framework. But I no longer have XNA with me, and the workarounds described on Stack Overflow always give me errors when using XNA. It also seems that Microsoft has made a mess of XNA anyway, with the weirdness of Windows 8 and XNA being only available on PC and Xbox. For these reasons I have decided to work with DirectX and Direct3D to produce my new game, although the overflowing credits after each DirectX game give me the shivers, and the low-level coding DirectX requires also puts me on thin ice with my games, leaving me in a confused mess about what decision I should make. I don't know anything about DirectX or Direct3D. I am an indie developer, but I am planning to take on a lot of professional aspects of games. I don't have heaps of time (2-3 hours a day). I don't mind the complexity of how DirectX works, as long as I can learn how to make the fundamentals of a game in a week. I am also unsure whether DirectX is really right for my situation, or whether I should keep with XNA game development. If anyone can tell me the best technology for me, that would be great.

    Read the article

  • Web hosting company basically forces me to use their domain name [closed]

    - by Jinx
    I've recently stumbled upon an unusual problem with a hosting company called giga-international.com. I ordered a com.hr domain from a Croatian domain name registrar, and my client insisted on using this hosting provider, as a couple of his friends are already hosted with them. I thought something was fishy when the first result on Google for Giga International was a little forum rant instead of their web page. When I was checking their services they listed many features - space available, bandwidth, etc. I just wanted to check how much RAM I get for my PHP scripts, so I emailed them, and they told me that was a company secret. Seriously? Anyway, since my client still insisted on hosting with them, I bought their Webspace package. During registration I had to choose a free domain name, because I couldn't proceed with the registration without it. Nowhere was it said, not even in the general terms and conditions, that I wouldn't be able to change that domain name - at least not without paying double the price of a domain name per year. They said I can either move my domain name over to them (and pay them for domain registration), or pay them 1 euro per month for managing a DNS entry. With every previous hosting provider I was able to manage my domain names simply by pointing the domain at their name servers, and this is something completely new and absurd to me. They also said that the usual approach is not possible because of security and hardware limitations. I'd like to know what you guys think about this case, and whether I should report it - and if so, where. In short: they forced me to register a free domain name which doesn't suit my needs in order to sign up for their webspace package, and they refuse to change the domain name on my account unless I either transfer my domain to them or pay them for DNS management, which costs double the price of the domain name per year.

    Read the article

  • After installing Office 365, can you go back to Office 2008 (without the CD)?

    - by Ryan
    I got this laptop from my dad and don't have the Microsoft Office 2008 CD, which is what he had installed when he gave it to me to use. Now I've got a client who wants me to do some freelance work and has set me up on Microsoft Exchange, and the first thing it wants me to do in Exchange is install Office 365. The client mentioned very briefly that he would get me the software if necessary, but he wasn't specific about what software. Now that I see it, my concern is that after the job is done I'll be left with a monthly bill just to have Office. Will it be possible to go back to Office 2008 without having the CD?

    Read the article

  • Why does this service refuse to start on Windows server 2003?

    - by PenguinCoder
    We have a Windows 2003 server with Cebos MQ1 (ver. 7 and ver. GRI) products installed that have been operational for years. After installing the Microsoft 2010 C++ Redistributable package, needed for other development, the MQ1 GRI service now fails to start. Event logs showed that two additional updates (.NET 4 and the 2010 C++ Redistributable SP2) were installed by the redistributable as well. As soon as we discovered the MQ1 service was not starting properly, we removed these three installed packages. However, the service still does not start; the dialog that pops up states 'The service started then stopped.' Event logs when we attempt to start the service show nothing; i.e. no errors, crashes, failures, or other information related to this service. Executing MQ1Serv.exe directly reports 'Missing command line operation, must specify install, uninstall and company abbreviation.' sc query MQ1Service(GRI) shows a clean exit with a Win32ExitCode of 0x0. Attempting to reinstall the client or server software gives an error of 'The procedure entry point ReInitializeCriticalSection could not be located in the dynamic link library KERNEL32.dll.' at the 'Registering Libraries' stage. At this point, further research suggested that the required function is in URL.dll and that we should verify the library is not corrupted. Running sfc /scannow on the server replaced a few DLLs, including URL.DLL, with versions from 2005. This actually broke other applications, which required a reinstall (one of them being IE 7). After the reinstall and updates, url.dll is version 7.0.5730.13 (2009) and kernel32.dll is version 5.2.3790.4480 (2009). The MQ1 GRI service still will not start, giving the same error as before: 'Service started then stopped'. Running a disassembler on kernel32.dll and url.dll shows no function named ReInitializeCriticalSection. Attempting to reinstall the MQ1 client and server and start the service again fails once more. However, setting the compatibility mode on the MQ1 client install exe to 'Windows 95' actually gets the program to install. Setting the compatibility mode on the MQ1 server service does not enable it to start. I have been researching this problem for nearly a week and, besides the advice to scan and replace url.dll, have come to no successful conclusions. This service was operational prior to the 2010 C++ install, without any additional parameters or settings. Removing the C++ install and all service packs/updates it installed silently still does not correct the issue of the MQ1 GRI service not starting. Q: Has anyone else run into this or a similar issue while attempting to get a service initialized? What have I overlooked, or what else can I try in order to get this service started?
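
    One way to double-check the export question without a full disassembler is to dump the export tables directly; a sketch using dumpbin from the Visual Studio command-line tools (assuming they are available on some machine, not necessarily the affected server):

        dumpbin /exports C:\Windows\System32\kernel32.dll | findstr /i CriticalSection
        dumpbin /exports C:\Windows\System32\url.dll | findstr /i CriticalSection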

    Read the article

  • Unable to access or make directories (and files) with FTP

    - by Kriem
    I'm having trouble with my new server and accessing its directories. I updated my proftpd.conf with: DefaultRoot / Now I'm able to see the root directory of my server. But trying to access some directories gives different results. For example, I can access /vars but I can't access /home or /root. How can I overcome this? This is what my FTP client says after trying to access /root: Server said: /root: No such file or directory Error -125: remote chdir failed. This is what my FTP client says after trying to create a new directory in /: Server said: untitled folder: Permission denied Error -140: remote mkdir failed
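
    A likely explanation (an assumption, since the FTP login user isn't shown) is plain filesystem permissions rather than ProFTPD configuration: /root is normally mode 700 and owned by root, and / is not writable by ordinary users, so a non-root FTP account cannot enter or create there no matter what DefaultRoot says. A quick check on the server:

        ls -ld / /root /home /home/*    # compare owners and permission bits
        id                              # run as the user the FTP login maps to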

    Read the article

  • Starting VMs with an executable with as low overhead as possible

    - by Robert Koritnik
    Is there a solution for creating a virtual machine and starting it from an executable file - and, if possible, starting it as quickly as possible? Strange situation? Not at all. Read on... Real-life scenario: since we can't have a domain controller on a non-server OS, it would be nice to have the domain controller in as thin a machine as possible (possibly Samba or similar, because we'd like it to start up as quickly as possible - in a matter of a few seconds), packed into a single executable. We could then configure our non-server OS to run the executable when it starts, before a user logs in. This would make it possible to log in to a domain.
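
    If VirtualBox is an option, one way to approximate this (a sketch; the VM name and install path are placeholders) is a prebuilt headless VM started by a one-line wrapper that runs at boot, before logon, e.g. as a scheduled task or service:

        "C:\Program Files\Oracle\VirtualBox\VBoxManage.exe" startvm "dc-vm" --type headless

    Resuming the guest from a saved state instead of cold-booting it is the usual trick for getting the start time down to a few seconds.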

    Read the article

  • How do I use a URL path instead of a file path in an Open File dialog in Mac OSX or ChromiumOS?

    - by Chris
    In Windows 7 (and perhaps earlier), the default "Open File" dialog box allows you to type a full URL into the "File name" section as if it were a file path, e.g. "http://www.example.com/pic.gif" instead of "C:/windows/pictures/pic.gif". When uploading a file to a website on the client side - say, an image - this allows the client to upload a picture located on a server accessible via the URL instead of downloading the image, saving it locally, then referencing the local image in the "Open File" dialog. It's a great option for Windows users. I have three separate questions: What is this procedure formally called? How do I describe this succinctly so that my searches for more information are fruitful? Can something similar be done in Mac OSX, Chromium OS, or a Linux environment? If so, how? Thanks!

    Read the article

  • Slow WLAN file transfer between server and tablet

    - by user266985
    My file server is running Ubuntu 12.04 and I'm sharing files from it over Samba. It is connected via gigabit ethernet. My desktop, running Windows 8.1, is also connected via gigabit ethernet. I can transfer files between the two and completely saturate that gigabit pipe. However, I just got a Surface Pro 2, and I'm trying to stream HD movies from my server to the device over WiFi. For some reason, I can't get much past 1.5 MB/s transferring files over the network. I've tried streaming through XBMC and a standard file copy; no difference. To add to the confusion, if I connect to my guest network and then use my VPN server (installed on the router) to access the file server, I get around 3.2 MB/s. I've been running diagnostics to determine the root cause and I think I've found it, but I have no idea what is causing it or how to fix it.

    Router: Asus RT-N66U
    Surface Pro 2 network card: Marvell Avastar 350N (driver 19/09/2013 v14.69.24044.150)
    inSSIDer: Link Score 100, Co-Channels 0, Overlapping 0, 5 GHz network channel 48+44

    iperf, file server as server, Surface Pro 2 as client - TCP performance: acceptable

        ------------------------------------------------------------
        Server listening on TCP port 5001
        TCP window size: 85.3 KByte (default)
        ------------------------------------------------------------
        [  4] local 192.168.0.90 port 5001 connected with 192.168.0.56 port 57367
        [ ID] Interval       Transfer     Bandwidth
        [  4]  0.0- 1.0 sec  10.1 MBytes  84.7 Mbits/sec
        [  4]  1.0- 2.0 sec  10.4 MBytes  87.6 Mbits/sec
        [  4]  2.0- 3.0 sec  10.6 MBytes  88.8 Mbits/sec
        [  4]  3.0- 4.0 sec  10.7 MBytes  89.5 Mbits/sec
        [  4]  4.0- 5.0 sec  10.1 MBytes  84.4 Mbits/sec
        [  4]  5.0- 6.0 sec  10.2 MBytes  85.8 Mbits/sec
        [  4]  6.0- 7.0 sec  7.04 MBytes  59.1 Mbits/sec
        [  4]  7.0- 8.0 sec  10.8 MBytes  90.2 Mbits/sec
        [  4]  8.0- 9.0 sec  10.6 MBytes  89.1 Mbits/sec
        [  4]  9.0-10.0 sec  8.62 MBytes  72.3 Mbits/sec
        [  4]  0.0-10.0 sec  99.2 MBytes  83.1 Mbits/sec

    iperf, Surface Pro 2 as server, file server as client - performance: poor

        ------------------------------------------------------------
        Client connecting to 192.168.0.56, TCP port 5001
        TCP window size: 22.9 KByte (default)
        ------------------------------------------------------------
        [  3] local 192.168.0.90 port 40233 connected with 192.168.0.56 port 5001
        [ ID] Interval       Transfer     Bandwidth
        [  3]  0.0- 1.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  1.0- 2.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  2.0- 3.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  3.0- 4.0 sec  1.25 MBytes  10.5 Mbits/sec
        [  3]  4.0- 5.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  5.0- 6.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  6.0- 7.0 sec  1.38 MBytes  11.5 Mbits/sec
        [  3]  7.0- 8.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  8.0- 9.0 sec  1.50 MBytes  12.6 Mbits/sec
        [  3]  9.0-10.0 sec  1.62 MBytes  13.6 Mbits/sec
        [  3]  0.0-10.1 sec  15.0 MBytes  12.4 Mbits/sec

    For some reason it gets capped, and I haven't got a clue why. Any suggestions? Edit: my link speed is reported as 270 Mbps by Windows. I'm less than two metres from the router with a clear line of sight.
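
    One detail worth noting in the two runs above is the default TCP window size (85.3 KByte when the file server receives vs 22.9 KByte when it sends). Forcing a larger window on both ends is a cheap way to test whether that, rather than the radio, is the limiting factor (a sketch, assuming iperf 2.x on both machines):

        iperf -s -w 256k                           # on the Surface Pro 2, listen with a larger window
        iperf -c 192.168.0.56 -w 256k -t 10 -i 1   # on the file server, repeat the upstream test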

    Read the article

  • Auto-starting a GUI application that requires sudo

    - by nanostuff
    Question: I need to auto-start a GUI application that requires sudo. I know I need to edit the sudoers file with sudo visudo; however, I don't know what to write in the file.

    What I already tried: sudo visudo and then added the following:

        nanostuff ALL = NOPASSWD: /usr/lib/AirVPN/AirVPN.exe

    I also tried with:

        nanostuff ALL = NOPASSWD /usr/bin/X11/airvpn

    and

        nanostuff ALL = NOPASSWD /usr/bin/airvpn

    None of those worked. By doing ps aux | grep airvpn I get the following output:

        nanostuff  6805  0.2  0.4  483520 17384 ?     Sl  17:13  0:01 /usr/bin/gksu -u root -m AirVPN Client needs administrative privileges. Please enter your password. mono /usr/lib/AirVPN/AirVPN.exe path=/home/nanostuff/.airvpn
        root       6806  0.0  0.0   78604  2392 ?     Ss  17:13  0:00 /usr/bin/sudo -H -S -p GNOME_SUDO_PASS -u root -- mono /usr/lib/AirVPN/AirVPN.exe path=/home/nanostuff/.airvpn
        root       6808  3.2  2.0 1257532 83032 ?     Sl  17:13  0:12 mono /usr/lib/AirVPN/AirVPN.exe path=/home/nanostuff/.airvpn
        root       6832  0.0  0.0   22652  3336 ?     S   17:14  0:00 /usr/sbin/openvpn --config /home/nanostuff/.airvpn/384ef91f85df5ea2abc88c7416b95bbdf2bc4299edd2850614d4e343ba721ae3.tmp.ovpn
        nanostuff  6951  0.0  0.0   18932   932 pts/2 S+  17:20  0:00 grep --color=auto airvpn

    Additional info: OS: Ubuntu 14.04 64-bit. Application: it's a VPN client.
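
    For what it's worth, the ps output shows the panel being launched through gksu/sudo as "mono /usr/lib/AirVPN/AirVPN.exe ...", so a sudoers rule would have to match that full command (with mono's absolute path) rather than the .exe alone. A sketch of what that might look like - treat the exact paths as assumptions and verify them with which mono and sudo -l:

        # /etc/sudoers.d/airvpn   (edit with: sudo visudo -f /etc/sudoers.d/airvpn)
        nanostuff ALL = (root) NOPASSWD: /usr/bin/mono /usr/lib/AirVPN/AirVPN.exe *

        # autostart command, bypassing the gksu password prompt:
        sudo mono /usr/lib/AirVPN/AirVPN.exe path=/home/nanostuff/.airvpn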

    Read the article

  • Good links somehow being converted to ones with a PHP redirect (not a virus)

    - by Rebecca
    This has happened to links we put on web pages and in emails. We might put www.oursite.org/work/ but when I view source it shows up as webmail.ourhosting.ca/hwebmail/services/go.php?url=https%3A%2F%2Fwww.oursite.org%2F%2work%2F This ends up at the webmail login page for our web host. But only some of the people who click the link get the login page; others go directly to the page we intended. We don't want it to go to the webmail login page; nobody needs to log in to our web site. This occurs for links to pages on our site, but also for links to other sites that we put in emails or in posts. It seems to be independent of both browser and email client, as we have variously used Firefox and Chrome as well as MS Outlook and Thunderbird. I've tried to resolve the issue with our web host, but they keep telling me they don't support our browser or our email client (i.e., they don't understand the issue). At the moment, our only option is to try another web host just to get rid of their login. Any ideas about what's going on?

    Read the article

  • Apache certificates for some URLs not working

    - by Vegaasen
    We are having a rather strange problem with an Apache installation. Here is a short summary. I'm setting up Apache with HTTPS and server certificates. This is fairly easy and works straight out of the box, as expected. This is the configuration for that setup:

        Listen 443
        SSLEngine on
        SSLCertificateFile "/progs/apache/ssl/example-site.no.pem"
        SSLCertificateKeyFile "/progs/apache/ssl/example-site.no.key"
        SSLCACertificateFile "/progs/apache/ssl/ca/example_root.pem"
        SSLCADNRequestFile "/progs/apache/ssl/ca/example_intermediate.pem"
        SSLVerifyClient none
        SSLVerifyDepth 3
        SSLOptions +StdEnvVars +ExportCertData
        RequestHeader set ssl-ClientCert-Subject-CN "%{SSL_CLIENT_S_DN}s"
        RewriteEngine On
        ProxyPreserveHost On
        ProxyRequests On
        SSLProxyEngine On
        ...
        <LocationMatch /secureStuff/$>
          SSLVerifyClient require
          Order deny,allow
          Allow from All
        </LocationMatch>
        ...
        <Proxy balancer://exBalancer>
          Header add Set-Cookie "EX_ROUTE=EB.%{BALANCER_WORKER_ROUTE}e; path=/" env=BALANCER_ROUTE_CHANGED
          BalancerMember http://10.0.0.1:7200 route=ee1 retry=300 flushpackets=off keepalive=on
          BalancerMember http://10.0.0.2:7200 route=ee2 retry=300 flushpackets=off keepalive=on status=+H
          ProxySet stickysession=EX_ROUTE scolonpathdelim=Off timeout=10 nofailover=off failonstatus=505 maxattempts=1 lbmethod=bybusyness
          Order deny,allow
          Allow from all
        </Proxy>
        RewriteCond %{REQUEST_URI} !^/index.html [NC]
        RewriteRule ^/(.*)$ balancer://exBalancer/$1 [P,NC]
        ProxyPassReverse / balancer://exBalancer/
        Header edit Set-Cookie "(.*)" "$1;HttpsOnly"
        ...

    Everything works fine and as expected for all of the pages that are not covered by the LocationMatch directive. When requesting something that matches the LocationMatch directive, I'm asked for a certificate (because of the SSLVerifyClient require directive), and my browser offers the correct certificates based on the root/intermediate chain. After choosing a certificate and clicking "OK", this is what pops up in the Apache logs:

        [ssl:info] [pid 9530:tid 25] [client :43357] AH01998: Connection closed to child 86 with abortive shutdown (
        [Thu Oct 11 09:27:36.221876 2012] [ssl:debug] [pid 9530:tid 25] ssl_engine_io.c(1171): (70014)End of file found: [client 10.235.128.55:45846] AH02007: SSL handshake interrupted by system [Hint: Stop button pressed in browser?!]

    And this just spams the logs. What is happening here? I can see this configuration working on my local machine, but not on one of our servers. There are no configuration differences between the machines, only minor application-level changes. I've tried the following: 1) removing the CA certificate check (works); 2) requiring the CA certificate for the whole site (works); 3) adding "SSLVerifyClient optional" (does not work); 4) and more.

    Server/application information. Local: OpenSSL 1.0.1x, Apache 2.4.3, Ubuntu, mpm event, every relevant option turned on (failing). Server: OpenSSL 0.9.8e, Apache 2.4.2, SunOS, mpm worker, every relevant option turned on.

    Please let me know if more information is needed; I'll provide it immediately. Brief sum-up: running Apache 2.4; server certificates work just fine; client certificates for some Locations do not work, failing with the errors above. PS: Could it be related to the OpenSSL version and the renegotiation changes related to TLS/SSLv3?
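
    Since a per-location SSLVerifyClient require forces a renegotiation in the middle of an established connection, one thing worth ruling out (an assumption, prompted by the OpenSSL 0.9.8e build on the failing server and the renegotiation question above) is the renegotiation-related directives in mod_ssl; a sketch of the kind of tweak sometimes needed:

        <LocationMatch /secureStuff/$>
          SSLVerifyClient require
          # any request body must fit in the renegotiation buffer (default 131072 bytes)
          SSLRenegBufferSize 1048576
        </LocationMatch>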

    Read the article

  • How to make sure clients update their browser cache when my website is updated?

    - by user64204
    I am using the HTTP 1.1 Cache-Control header to implement client-side caching. Since I update my website only once a month, I would like the CSS and JS files to be cached for 30 days with Cache-Control: max-age=2592000. The problem is that the 30-day period defined by Cache-Control doesn't coincide with the website update cycle: it starts from the moment users visit the site and ends 30 days later, which means an update could occur in the meantime and users would be running with outdated content for a while. That could break the rendering of the website if, for instance, the HTML and CSS no longer match. How can I perform client-side caching of content for periods of several days but somehow get users to refresh their CSS/JS files after the website has been updated? One solution I could think of is that, if website updates can be scheduled, the max-age returned by the server could be decreased accordingly every day, so that no matter when people visit the website, the end of the caching period would coincide with the update of the website - but changing the server configuration every day goes against one of my sysadmin principles (once it's running, don't touch it).
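
    For context, the header described above is typically set per file type in the web server configuration; a minimal sketch for Apache with mod_headers (the match pattern and value are illustrative):

        <FilesMatch "\.(css|js)$">
            Header set Cache-Control "max-age=2592000, public"
        </FilesMatch>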

    Read the article

  • PuTTY automatically supply password

    - by Kyle Cronin
    I have a situation where I need to have PuTTY (or another SSH client for Windows) automatically log into another machine via SSH. I realize that this isn't a good idea security-wise, but unfortunately I'm constrained by limitations on both the client and the server. The best solution would be a shortcut or script on the desktop that, when double-clicked, connects to the server and automatically logs in. Can I do this with PuTTY? I am willing to explore public key authentication, but I'm not sure where the PuTTY key resides or how to copy it to the server, as the app starts automatically upon login.
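
    For what it's worth, PuTTY's command line can carry the credentials, so a desktop shortcut can do this on its own (a sketch with placeholder names; the plaintext password in the shortcut target is exactly the security trade-off mentioned above):

        "C:\Program Files\PuTTY\putty.exe" -load "MySavedSession" -l myuser -pw mypassword

    The key-based alternative would be to generate a key pair with puttygen.exe, append the public key to ~/.ssh/authorized_keys on the server, and point the saved session (or pageant.exe) at the private .ppk file.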

    Read the article

  • How can I unify my email, calendar and tasks (2 exchange accounts + 1 gmail)

    - by Assaf Stone
    This is my situation: I work as a consultant and thus work from multiple computers - my work laptop, a desktop at my primary client, my desktop at home, an Android smartphone and an Android tablet. Likewise, I have multiple accounts: a Microsoft Exchange (2010, AFAIK) account, a Microsoft Exchange (2007, AFAIK) account, and a Gmail account. The most important thing I need is the ability to have events in one calendar affect the free/busy status of all the other accounts (so that if I am busy on Monday at 9am with an event from my employer's account, that time shows as busy in my client's account and in the Gmail account). The second thing I need is a unified view of all of my accounts' information: appointments, email, tasks and contacts (in that order of importance). I've already tried Outlook synchronization tools such as gSyncit to sync both Exchange accounts with Gmail, but this creates a mess when updating appointments (deleted appointments sometimes return, timestamps revert). Is there perhaps some way to at least synchronize the free/busy state so that all of my calendar apps/accounts will look there to see whether I can be invited? Just solving that would be well worth my while. Thanks, Assaf

    Read the article

  • Remote Desktop Encryption

    - by Kumar
    My client is RDP 6.1 (on Windows XP SP3) and the server is Windows Server 2003. I have installed an SSL certificate on the server for RDP. In the RDP settings (General tab), the encryption method is set to SSL/TLS 1.0 and the encryption level is set to "Client Compatible". I have the following questions. In this case, is it guaranteed that all communication is encrypted even when I log in remotely to the server - I mean, is the password encrypted? Does RDP always use some kind of encryption even if there is no SSL certificate installed on the server? In that case I do not see the security lock in the connection bar; when I set the encryption level to "High" I do see the security lock. I believe that communication in both cases will be encrypted - is that true? Please reply to my questions. Thanks in advance, Kumar

    Read the article
