Search Results

Search found 37772 results on 1511 pages for 'browser based games'.


  • Show floor-plans online like a map

    - by Quora Feans
    Given a floor-plan which is too big for any screen, even a 17" one, how can I show it online like a map? It would need functionality that a browser alone does not have (just zooming the entire image in and out won't do the trick). The image will be broken down into smaller JPGs, so the user will not have to download the whole floor-plan at once. It will need zoom in/zoom out buttons, and some way of bookmarking a position (x,y). Open-source solutions preferred.
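    A minimal sketch of the tiling step, assuming Pillow is available; the input file name, tile size and output layout are illustrative choices, not anything specified above. Slippy-map viewers such as Leaflet or OpenLayers can then load tiles like these per zoom level.

      # Cut a large floor-plan into 256x256 JPEG tiles for one zoom level.
      # "floorplan.png" and the tiles/ output layout are assumptions.
      from PIL import Image
      import os

      TILE = 256
      img = Image.open("floorplan.png")
      w, h = img.size

      os.makedirs("tiles", exist_ok=True)
      for ty in range(0, h, TILE):
          for tx in range(0, w, TILE):
              tile = img.crop((tx, ty, min(tx + TILE, w), min(ty + TILE, h)))
              tile.convert("RGB").save(f"tiles/{tx // TILE}_{ty // TILE}.jpg", quality=80)

    Lower zoom levels can be produced the same way after scaling the source image down by powers of two.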

    Read the article

  • Emails sent from Coldfusion using the same SMTP/Exchange server works from one machine but fails for another

    - by Peter Herdenborg
    First, apologies if this question is too vague or has too little information to really be answerable. I do not normally work with these issues, and I don't have full access to the environment. However, the hosting provider seems to have a hard time tracking down the issue, so I am hoping that someone can at least provide me with some qualified guesses about the most likely problem. Here goes: a client I work for has a hosted IT environment, based on virtual machines running Windows 2008 R2 Standard. Our website, based on ColdFusion 9, was recently migrated from one virtual machine to another, and though ColdFusion is configured in exactly the same way, using the same SMTP server (the client's Exchange server, hosted in the same environment and in the same AD as both web servers), sending emails to external recipients is no longer working. It still works fine when testing from the old machine. This is what I've learnt so far (all emails are sent using a valid from-address on the client's domain):

    - Emails sent to other recipients on the same domain are delivered without any problem.
    - Emails sent to external recipients on other domains are never delivered.
    - When sending emails to both internal and external recipients, no emails are delivered.
    - When receiving one of these emails at an internal address, the sender is now shown as "[email protected]", while when sent from the old machine it used to say just "sender". This seems to me to hint that the Exchange machine "recognizes" the old web server while the new one is a stranger to it.
    - In ColdFusion's mail log, all messages appear to be successfully delivered to the SMTP server.

    Any ideas about what settings to look at, what log entries to search for, or how to compare the old web server with the new one will be highly appreciated.
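    One way to take ColdFusion out of the picture is to talk to the SMTP server directly from the new machine. A minimal sketch using Python's smtplib; the host name and addresses below are placeholders, not values from the question:

      # Run from the new web server: does the Exchange/SMTP host accept an
      # external recipient from this machine at all?
      import smtplib

      server = smtplib.SMTP("exchange.example.local", 25, timeout=10)
      server.set_debuglevel(1)   # print the full SMTP conversation
      server.ehlo()
      try:
          server.sendmail("sender@clientdomain.example",
                          ["someone@external.example"],
                          "Subject: relay test\r\n\r\nTest from the new web server.\r\n")
          print("accepted for delivery")
      except smtplib.SMTPRecipientsRefused as exc:
          print("relay refused:", exc.recipients)
      finally:
          server.quit()

    If the external recipient is refused here as well, the difference is most likely in how relay permissions on the Exchange side treat the two source IPs rather than in ColdFusion.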

    Read the article

  • Apache certificates for some urls not working

    - by Vegaasen
    We are having a rather strange problem with an Apache installation. Here is a short summary: I'm currently setting up Apache with HTTPS and server certificates. This is fairly easy and works straight out of the box, as expected. This is the configuration for this setup:

      Listen 443
      SSLEngine on
      SSLCertificateFile "/progs/apache/ssl/example-site.no.pem"
      SSLCertificateKeyFile "/progs/apache/ssl/example-site.no.key"
      SSLCACertificateFile "/progs/apache/ssl/ca/example_root.pem"
      SSLCADNRequestFile "/progs/apache/ssl/ca/example_intermediate.pem"
      SSLVerifyClient none
      SSLVerifyDepth 3
      SSLOptions +StdEnvVars +ExportCertData
      RequestHeader set ssl-ClientCert-Subject-CN "%{SSL_CLIENT_S_DN}s"
      RewriteEngine On
      ProxyPreserveHost On
      ProxyRequests On
      SSLProxyEngine On
      ...
      <LocationMatch /secureStuff/$>
        SSLVerifyClient require
        Order deny,allow
        Allow from All
      </LocationMatch>
      ...
      <Proxy balancer://exBalancer>
        Header add Set-Cookie "EX_ROUTE=EB.%{BALANCER_WORKER_ROUTE}e; path=/" env=BALANCER_ROUTE_CHANGED
        BalancerMember http://10.0.0.1:7200 route=ee1 retry=300 flushpackets=off keepalive=on
        BalancerMember http://10.0.0.2:7200 route=ee2 retry=300 flushpackets=off keepalive=on status=+H
        ProxySet stickysession=EX_ROUTE scolonpathdelim=Off timeout=10 nofailover=off failonstatus=505 maxattempts=1 lbmethod=bybusyness
        Order deny,allow
        Allow from all
      </Proxy>
      RewriteCond %{REQUEST_URI} !^/index.html [NC]
      RewriteRule ^/(.*)$ balancer://exBalancer/$1 [P,NC]
      ProxyPassReverse / balancer://exBalancer/
      Header edit Set-Cookie "(.*)" "$1;HttpsOnly"
      ...

    Everything works fine and as expected for all of the pages that are not covered by the LocationMatch directive. When requesting something that matches it, I'm asked for a certificate (because of SSLVerifyClient require), and my browser offers the correct certificates based on the root/intermediate chain. After choosing a certificate and clicking "OK", this is what pops up in the Apache logs, and it just spams them:

      [ssl:info] [pid 9530:tid 25] [client :43357] AH01998: Connection closed to child 86 with abortive shutdown (
      [Thu Oct 11 09:27:36.221876 2012] [ssl:debug] [pid 9530:tid 25] ssl_engine_io.c(1171): (70014)End of file found: [client 10.235.128.55:45846] AH02007: SSL handshake interrupted by system [Hint: Stop button pressed in browser?!]

    What is happening here? I can see this configuration working on my local machine, but not on one of our servers. There are no configuration differences between the servers, only minor application-level changes. I've tried the following:

    1. Removing the CA-certificate checking (works)
    2. Requiring the CA certificate for the whole site (works)
    3. Adding "SSLVerifyClient optional" (does not work)
    4. ++

    Server/application information:

    - Local (working): OpenSSL 1.0.1x, Apache 2.4.3, Ubuntu, mpm event, every configuration should be turned on
    - Failing server: OpenSSL 0.9.8e, Apache 2.4.2, SunOS, mpm worker, every configuration should be turned on

    Please let me know if more information is needed; I'll provide it instantly. Brief sum-up: running Apache 2.4; server certificates work just fine; client certificates for some Locations do not work and fail with the errors above. PS: Could it be related to the OpenSSL version and the renegotiation restrictions for TLS/SSLv3?
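    A small diagnostic that might help narrow this down is to drive the client-certificate handshake from a script instead of a browser. A minimal sketch in Python; the host and the certificate/key file names are placeholders, and whether the mid-connection renegotiation triggered by the LocationMatch succeeds will depend on the OpenSSL build in use:

      # Present a client certificate and request the protected path directly.
      import socket
      import ssl

      HOST = "example-site.no"                  # placeholder host
      ctx = ssl.create_default_context()
      ctx.load_cert_chain(certfile="client.pem", keyfile="client.key")  # placeholder files
      ctx.check_hostname = False
      ctx.verify_mode = ssl.CERT_NONE           # only the client side is under test here

      with socket.create_connection((HOST, 443), timeout=10) as sock:
          with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
              print("initial handshake OK:", tls.version(), tls.cipher())
              tls.sendall(b"GET /secureStuff/ HTTP/1.1\r\nHost: " + HOST.encode() +
                          b"\r\nConnection: close\r\n\r\n")
              print(tls.recv(300))

    If this fails at the second step only against the SunOS box, that points toward the OpenSSL 0.9.8 renegotiation behaviour rather than the certificates themselves.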

    Read the article

  • apt-get install not working in script

    - by isoman
    I wrote a small script that installs a set of Linux packages. Strangely, apt-get install always fails and tells me that the packages have not been found. Here is my script:

      #!/bin/bash
      sudo apt-get install python-software-properties
      sudo apt-get update
      sudo add-apt-repository ppa:pitti/postgresql
      sudo apt-get install xfce4 postgresql-9.0 pgadmin3 chromium-browser wine iftop

    What can I do to fix this? Thanks.

    Read the article

  • CPU usage constantly near 100% on Windows XP SP3

    - by user23954
    When I run some heavy applications like games or VirtualBox for a while, the CPU usage is normal for about 15 minutes and then suddenly increases. Even after I quit the heavy apps, any other application I then start also shows very high CPU usage. This continues until I reboot the system. There is no single particular process occupying most of the CPU; every process's CPU usage is a little higher than normal. Any solutions?

    Read the article

  • CentOS server. What does it mean when the total used RAM does not equal the sum of RES?

    - by Michael Green
    I'm having a problem with a hosted virtual server running CentOS. In the past month a process (Java based) that had been running fine started having problems getting memory when the JVM was started. One strange thing I've noticed is that when I start the process, its PID says it is using 470 MB of RAM while the 'used' memory immediately jumps by over 1 GB. If I run 'top', the total RES used across all processes falls short of the 'used' figure at the top by almost 700 MB. The support person says this means I have a memory leak in my process. I don't know what to believe, because I would expect a memory leak to simply waste the memory the process has been allocated, not to consume additional memory that doesn't show up in 'top'. I'm a developer and not a server guy, so I'm appealing to the experts. To me, if the total RES memory doesn't add up to the total 'used', it indicates that something is wrong with my virtual server set-up. Would you also suspect a leaking Java process in this case? free before starting the process:

      total used free shared buffers cached
      Mem: 2097152 149264 1947888 0 0 0
      -/+ buffers/cache: 149264 1947888
      Swap: 0 0 0

    free after:

      total used free shared buffers cached
      Mem: 2097152 1094116 1003036 0 0 0
      -/+ buffers/cache: 1094116 1003036
      Swap: 0 0 0

    So it looks as though the process is using (or causing to be used) nearly 1 GB of RAM. Since the process (based on top) is only using 452 MB, does that mean that the kernel is all of a sudden using an additional 500 MB?
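    A minimal sketch (Linux only, run as root to see every process) that adds up the resident set size reported in /proc/<pid>/status and compares it with "used" memory derived from /proc/meminfo, which is essentially the comparison described above. Note that shared pages are counted once per process, so the RSS sum normally over-counts rather than falling short:

      import os
      import re

      def meminfo():
          # /proc/meminfo lines look like "MemTotal:  2097152 kB"
          info = {}
          with open("/proc/meminfo") as f:
              for line in f:
                  key, value = line.split(":", 1)
                  info[key] = int(value.split()[0])
          return info

      total_rss_kb = 0
      for pid in filter(str.isdigit, os.listdir("/proc")):
          try:
              with open("/proc/%s/status" % pid) as f:
                  m = re.search(r"VmRSS:\s+(\d+) kB", f.read())
                  if m:
                      total_rss_kb += int(m.group(1))
          except (FileNotFoundError, PermissionError):
              pass  # process exited or is not readable

      mi = meminfo()
      used_kb = mi["MemTotal"] - mi["MemFree"] - mi.get("Buffers", 0) - mi.get("Cached", 0)
      print("sum of RES: %d MB" % (total_rss_kb // 1024))
      print("used minus buffers/cache: %d MB" % (used_kb // 1024))

    If a gap remains with everything readable, the difference typically sits in the kernel (slab, page tables) or, on a container-style virtual server, in accounting imposed by the host rather than in any one process.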

    Read the article

  • How do I install Visual Studio 2010 Express somewhere besides C:?

    - by TwentyMiles
    I have an SSD as my primary (C:) drive, mainly used for quickly loading games. It's pretty small (~30 GB), so I want to keep things that don't really need a speed boost off of it. I attempted installing the Visual Studio 2010 Express beta last night, and it claimed to require 2.1 GB of space, so I changed the install directory to a secondary, non-SSD drive. After this, the installer said that it would use 1.8 GB on C: and ~200 MB on the secondary drive. While this token gesture of moving 1/10 of the app to the place I told it to is cute, I really want to install everything I can to the secondary drive. Is there any way to install all of Visual Studio 2010 Express to a drive besides C:?

    Read the article

  • Is there a way to see what shut down the computer?

    - by Celeritas
    I had many programs open and was typing a message in my web browser when suddenly a window asking me something popped up. I think I was in the middle of typing the word "for", but whatever button I hit seemed to be the confirmation to shut down the computer. Is there a way to find out which program caused this and prevent it in the future? I have a hunch it was JDownloader's fault. I'm using Windows 7.
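    On Windows 7, orderly shutdown requests are normally logged in the System event log as Event ID 1074, which names the process and user that initiated them. A minimal sketch that queries this via wevtutil (flag spellings worth double-checking against wevtutil /?):

      import subprocess

      cmd = [
          "wevtutil", "qe", "System",
          "/q:*[System[(EventID=1074)]]",  # only shutdown/restart requests
          "/f:text",                       # human-readable output
          "/c:5",                          # last five matching events
          "/rd:true",                      # newest first
      ]
      print(subprocess.run(cmd, capture_output=True, text=True).stdout)

    The same events can also be browsed interactively in Event Viewer under Windows Logs > System.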

    Read the article

  • How do I view the location of an SWF file that is obfuscated somehow?

    - by atticus
    Specifically, I'm trying to view Elmo's Keyboard-o-rama fullscreen. The original swf file has been moved and obscured. For a toddler, this game really needs to be full screen! The toddler doesn't mind too much and has already lost interest in the game for the day. But it's just driving me crazy. I've tried the usual method of viewing the page info in Firefox to no avail. And before people start trying to delete this for being game specific, I would like to know how to do this for any obfuscated swf location, not just games. Thanks in advance. If anybody knows how to find the appropriate information in tcpdump or wireshark, that could probably help, too. That's what I'm trying to do right now.
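    A minimal sketch of the "search the page source" approach done in code rather than with Firefox's page info: fetch the embedding page and list anything that looks like a .swf URL. The URL below is a placeholder, and Flash loaders that pull the movie in from script or from a child page will need the same scan applied to those resources (or a tcpdump/Wireshark capture filtered on requests whose URLs contain ".swf"):

      import re
      import urllib.request

      PAGE = "http://www.example.com/game-page.html"   # placeholder: page embedding the game
      html = urllib.request.urlopen(PAGE).read().decode("utf-8", errors="replace")

      # Pull out quoted or parenthesised strings that reference a .swf file.
      for candidate in sorted(set(re.findall(r"""["'(]([^"'()]+\.swf[^"'()]*)["')]""", html))):
          print(candidate)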

    Read the article

  • Take a screenshot of an entire webpage in Opera

    - by robertc
    Is there some tool within Opera, or possibly an add-on, that will let me take a screenshot of an entire web page? I usually use Screengrab to do this with Firefox, but in this situation I want a screenshot of the page as Opera renders it (because I want to show the page rendered with HTML5 form controls like date and time). I am currently using Opera 10.60 x86_64 on Fedora 12, so solutions that work in the browser would be preferable to external programs.

    Read the article

  • Xen or KVM? Please help me decide and implement the one which is better

    - by JohnAdams
    I have been doing research into implementing virtualization for a server running three guests: two Linux based and one Windows. After trying my hand at XenServer, I am impressed with the architecture and wanted to use the open-source Xen, which is when I started hearing a lot more about KVM, how good it is, how it's the future, etc. So, could anyone here please help me answer some of my queries about KVM and Xen?

    1. Based on my requirement of three VMs on one server, which is better for performance, KVM or Xen, considering that one of the Linux VMs will work as a file server, one as a mail server, and the third will be a Windows server?
    2. Is KVM stable? What about upgrades?
    3. What about Xen? I cannot find support for it on Ubuntu.
    4. Are there any published benchmarks of both Xen and KVM? I cannot seem to find any.
    5. If I go with Xen, will it be possible to move to KVM later, or vice versa?

    In summary, I am looking for real answers on which one I should use: Xen or KVM?

    Read the article

  • RV042 - how to broadcast UDP through VPN

    - by user47221
    I set up a gateway-to-gateway VPN connection between Linksys RV042 routers. The two sites can ping each other and access file sharing (with NetBIOS enabled). They use the same subnet mask 255.255.255.0: 192.168.1.0 <--> 192.168.2.0, with the firewall disabled. But when I create/host LAN games (e.g. Warcraft III), they cannot be detected by clients at the other site/LAN. As I understand it, Warcraft III uses a UDP broadcast to tell clients that a game has been created.

    1. How can I broadcast UDP to the other LAN? Does it have anything to do with multicast?
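    For context, the game's announcement is a limited broadcast, which routers and the VPN tunnel will not forward; a directed broadcast to the remote subnet is routable if the remote router agrees to deliver it. A minimal sketch of the difference (port 6112 is commonly cited for Warcraft III, but treat it as an example value):

      import socket

      PORT = 6112
      sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
      sock.setsockopt(socket.SOL_SOCKET, socket.SO_BROADCAST, 1)

      # What the game itself sends: stays on the local segment only.
      sock.sendto(b"game announcement", ("255.255.255.255", PORT))

      # Directed broadcast toward the remote LAN across the tunnel.
      sock.sendto(b"game announcement", ("192.168.2.255", PORT))

    Whether the directed form arrives depends on the routers; many drop directed broadcasts by default, which is why broadcast-discovery games usually need a relay or an explicit "join by IP" across site-to-site VPNs.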

    Read the article

  • Setting up a subdomain on IIS7

    - by EvanGWatkins
    I have a dev server for my C# web application, and to access the dev site I go to the server name in my browser (lfi-fsxmv06), where I can reach my web application. Now I want to set up a subdomain of that (test.lfi-fsxmv06); is this possible? The bindings on the dev site (lfi-fsxmv06) are http with port 80 and IP address *, and the hostname is blank. The bindings on the subdomain site are http, port 80, IP address *, and the hostname is test.lfi-fsxmv06 however if I try t

    Read the article

  • Setting JAVA_HOME on Ubuntu 10.x

    - by user20285
    I'm trying to get the Rhodes framework installed so I can develop Android apps. This requires that I install the Sun JDK and add JAVA_HOME and JAVA_HOME/bin to my path. I thought I could solve this by editing my bash.bashrc file:

      JAVA_HOME="/usr/lib/jvm/java-6-sun/jre/bin/java"
      export JAVA_HOME
      PATH=$PATH:$JAVA_HOME/bin

    This still doesn't work, because when I run:

      rake run:android

    I get a prompt in the console that says the Java bin was not found in my path. However, running echo $PATH gets me:

      usernamee@ubuntu:~$ echo $PATH
      /usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/username/ruby/gems/bin:/usr/lib/jvm/java-6-sun/jre/bin/java/bin:/home/username/ruby_files/android-sdk-linux_86/tools

    What are my options here?

    Edit: If the problem is not the export statement, how can I ensure that the Sun JDK is properly installed and that I am, in fact, pointing to the correct path in bashrc?
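    A minimal sanity check, assuming it is run in a shell that has sourced the bashrc above: build tools generally expect JAVA_HOME to be the JDK/JRE directory itself, i.e. a directory that contains bin/java, so this prints whether the current value satisfies that.

      import os

      java_home = os.environ.get("JAVA_HOME", "")
      candidate = os.path.join(java_home, "bin", "java")
      print("JAVA_HOME =", repr(java_home))
      print(candidate, "->", "found" if os.path.isfile(candidate) else "NOT found")

    With the value above, the check looks for .../jre/bin/java/bin/java, which illustrates why appending $JAVA_HOME/bin to PATH does not end up pointing at the java binary.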

    Read the article

  • How much free memory should I have on my webserver?

    - by neanderslob
    I have a webserver that's currently hosting two WordPress sites and some Java-based collaboration software. The server has 2 GB of memory and is currently using about 1.8 GB of it. Right now what's on it is pretty much a pilot project getting negligible traffic, so I think it's pretty clear that I'll need more memory. I was wondering how I might anticipate my memory needs, based on the traffic it gets, if I were to release it. I've poked around on Google and what I've found has been a bit tenuous. Is there a good heuristic for calculating memory demands as a function of the base (no-traffic) load on the server? For reference, the output of free -m can be seen below:

      total used free shared buffers cached
      Mem: 2048 1832 215 0 0 0
      -/+ buffers/cache: 1832 215
      Swap: 0 0 0

    To me this looks like actual memory used, not an illusion due to caching or anything else. I figure the demands of my collaboration software will have to be tested experimentally, so here is free -m without that software running:

      total used free shared buffers cached
      Mem: 2048 1109 938 0 0 0
      -/+ buffers/cache: 1109 938
      Swap: 0 0 0

    My plan B to figure this out is to add a bunch of swap space to the server, give it some traffic and adjust according to the amount of swap that gets used. I was just wondering if anyone had a good rule of thumb for estimating how much memory I should plan on in advance... or if what I'm thinking is nuts. Many thanks in advance (I'm really quite new to this).
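    A back-of-the-envelope sketch of one common heuristic: memory needed is roughly the idle footprint plus peak concurrent workers times the memory of one worker, plus headroom. Every figure below except the idle value (taken from the first free -m output) is an assumption to be replaced with measured numbers:

      IDLE_MB = 1832            # "used" with everything running but no traffic
      PER_WORKER_MB = 60        # assumed size of one PHP/app worker; measure with top or ps
      PEAK_CONCURRENT = 20      # assumed simultaneous requests at peak
      HEADROOM = 1.25           # ~25% spare for page cache, spikes and growth

      needed_mb = (IDLE_MB + PEAK_CONCURRENT * PER_WORKER_MB) * HEADROOM
      print("rough target: %.0f MB (~%.1f GB)" % (needed_mb, needed_mb / 1024))

    The swap-and-measure plan B is a reasonable way to obtain the per-worker and concurrency numbers empirically.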

    Read the article

  • How can I have a Windows 7 VMware guest without mouse support?

    - by Matthew Read
    Despite having

      vmmouse.present = "FALSE"
      mouse.vusb.absDisabled = "TRUE"
      pref.motionUngrab = "FALSE"

    in my .vmx file, and a customized VMware Tools installation on the guest that does not include the mouse driver, I somehow have fully integrated mouse support for my Windows 7 VM. I can smoothly mouse from the host into the guest without needing to click or Ctrl+G in and Ctrl+Alt out. I don't want this because of the issues it causes with games. How can I get the VM to have no special mouse support while still having VMware Tools installed for its other functions (network, graphics, etc.)? The mouse works as I want without VMware Tools but not otherwise, again despite not installing the mouse driver and having all those settings trying to disable it. Device Manager shows that the generic Windows PS/2 mouse driver is being used and not the virtual mouse driver. Guest and host are both Windows 7 Ultimate SP1, x86 and x64 respectively. I'm using VMware Player 3.1.4 and the installed VMware Tools is the latest, 8.4.6.16648.

    Read the article

  • Internet connection fails after upgrading to Ubuntu 9.10

    - by Neofizz
    I am pretty sure that the latest Ubuntu release does not support my wireless internet card. Everything was working fine in 9.04, and since upgrading my browser will not load web pages. I can see and connect to my WEP-encrypted network, and I can ping google.com and lose no packets. I am at the end of my tether; what else can I try? Is it possible to download a driver to re-enable my wireless?

    Read the article

  • What is the risk of introducing non standard image machines to a corporate environment

    - by Troy Hunt
    I’m after some feedback from those in the managed desktop or network security space on the risks of introducing machines that are not built on a standard desktop image into a large corporate environment. This particular context relates to the standard corporate image (32-bit Win XP) in a large multi-national not being suitable for a particular segment of users. In short, I’m looking at what hurdles we might come across by proposing the introduction of machines which are built and maintained by a handful of software developers and not based on the corporate desktop image (proposing 64-bit Win 7). I suspect the barriers are primarily around virus definition updates, the rollout of service packs and patches and the compatibility of existing applications with the newer OS. In terms of viruses and software updates, if machines were using common virus protection software with automated updates and using Windows Update for service packs and patches, is there still a viable risk to the corporate environment? For that matter, are large corporate environments normally vulnerable to the introduction of a machine not based on a standard image? I’m trying to get my head around how real the risk of infection and other adverse events is from machines being plugged into the network. There are multiple scenarios outside of just the example above where this might happen (e.g. a vendor plugging in a machine for internet access during a presentation). Would a large corporate network normally be sufficiently hardened against such innocuous activity? I appreciate the theory as to why policies such as standard desktop images exist; I’m just interested in the actual, practical risk and how much a network should be protected by means other than what is managed on individual PCs.

    Read the article

  • 3 Monitors on a Notebook

    - by Rihan Meij
    I would like to use three screens with my Dell Inspiron 1720: the laptop's built-in screen as one, and then two more external screens. The catch is that I want to play racing games with this set-up, so that my main screen is the focus area (the front window, if you will) and the other two screens are used for peripheral vision on the sides. The software that I use (LFS.net) does support multiple screens. However, the notebook can only drive the main screen plus one external screen. So I would need to split this "second" monitor output into two screens, one to the left of the main monitor and the other to the right. Is this possible? Is there perhaps an external card / docking station solution that could help with this? Any advice or ideas are greatly appreciated. Best regards, Rihan

    Read the article

  • building a debian base image

    - by Michael
    Is there a preferred way to create base images for Debian-based customized installations? We are currently going with multistrap, but although it's better than hand-crafted chroot stuff, it still has a lot of edges and corners. Is there a more reliable and less error-prone way to produce a root filesystem of a Debian installation with some additional .debs installed? (I don't want to send out a Debian installer with a preseed file, though.)

    Addendum 1: To clarify things a bit: we are delivering a kind of software appliance to our customers. That is, a Debian operating system with some additional software packages, both our own and third-party ones, and some configuration changes. To ease the installation process, we have an installer that does nothing more than partitioning, copying files to the partitions and setting up GRUB. So it's basically an image-based installer; we run the Debian installation ourselves and just distribute the already-installed operating system. The question is about the installation part: I want to have that as easy and robust as possible, and of course it should be an automated process.
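    For comparison, a minimal sketch of driving debootstrap (the bootstrap tool used by the Debian installer itself) from a script; this is not the author's tooling, and the suite, mirror, target directory and extra packages are placeholders. It must run as root:

      import subprocess

      SUITE = "squeeze"                          # placeholder Debian release
      TARGET = "/srv/appliance-rootfs"           # placeholder output directory
      MIRROR = "http://deb.debian.org/debian"    # placeholder mirror
      EXTRA = ["openssh-server", "postgresql"]   # placeholder extra packages

      subprocess.run(
          ["debootstrap", "--include=" + ",".join(EXTRA), SUITE, TARGET, MIRROR],
          check=True,
      )
      # Site-specific .debs and configuration changes would then be applied
      # inside the resulting tree, e.g. via chroot and dpkg -i.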

    Read the article

  • How to restore Windows 7 to a known working state every time it boots

    - by Artanis
    A couple of days ago my mother asked me to set up a computer at her house; she wants to use it for basic web browsing, video chat and nothing more. The problem is that neither my mother nor my sister knows anything about using or maintaining a computer. What I want is to have a working base install of Windows 7 and just discard everything installed, downloaded, ... when it reboots. That way I can set up a partition just for saving files, and whatever they do, the computer will always return to a working state at start-up. Can that be done? PS: Sadly, Linux is not an option, since my sister wants to be able to play some games with my Steam account and not all of them run with Wine.

    Read the article
