Search Results



  • How to set up TightVNC Java viewer index.html on web server?

    - by penyuan
    I've got the Java TightVNC viewer applet set up with the provided index.html on my Mac OS X 10.6.3 machine with Web Sharing enabled. From a remote computer I can reach the webpage, but where the viewer is supposed to be I only see a white box with an X (an error indicator?). Any ideas on how to get this to work?

    I've tried setting the port (in index.html) to 5900 and 5901; neither worked. Is either of these the default VNC port for Mac OS X 10.6.3? Also, I've activated Screen Sharing and Remote Login in System Preferences, allowing VNC viewers to connect. Here is the code for my index.html:

        <HTML>
        <TITLE> TightVNC desktop </TITLE>
        <APPLET CODE="classes/VncViewer.class" ARCHIVE="classes/VncViewer.jar" WIDTH="1440" HEIGHT="900">
        <PARAM NAME="PORT" VALUE="5900">
        <PARAM NAME="Scaling factor" VALUE="50">
        </APPLET>
        <BR>
        <A href="http://www.tightvnc.com/">TightVNC site</A>
        </HTML>

    Again, I can get to this page, but the applet doesn't seem to work, and the Java console doesn't say anything. Thanks in advance for your help!
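
    For reference, a hedged sketch of how this markup is often written when the applet classes sit in a classes/ subdirectory: with an ARCHIVE in play, CODE is normally just the class name and CODEBASE points at the directory. This is an assumption about the failure, not a verified fix; the sizes and port below are simply copied from the question.

        <HTML>
        <TITLE>TightVNC desktop</TITLE>
        <!-- Sketch only: CODEBASE names the directory that holds the jar;
             CODE and ARCHIVE are resolved relative to it. Port 5900 is the
             usual VNC/Screen Sharing port on Mac OS X. -->
        <APPLET CODEBASE="classes/" CODE="VncViewer.class" ARCHIVE="VncViewer.jar"
                WIDTH="1440" HEIGHT="900">
          <PARAM NAME="PORT" VALUE="5900">
          <PARAM NAME="Scaling factor" VALUE="50">
        </APPLET>
        <BR>
        <A href="http://www.tightvnc.com/">TightVNC site</A>
        </HTML>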

    Read the article

  • Why do ticket websites crash?

    - by Soloman Smart
    I hope this fits within the scope of Server Fault; apologies if it doesn't. Why do websites selling tickets for major concerts and events still crash when they put the tickets on sale? Surely they know there is going to be huge demand and can ensure they have the capacity to deal with it? This may seem like a very simple question, so apologies to those who already understand. Thanks!

    Read the article

  • Online PIM Needed: EverNote + Cozi = ???

    - by Shoeless
    I am looking for an online solution with free-form note-taking ability like OneNote or EverNote, and also a robust calendaring system (tasks, repeating appointments, notifications). Cozi has a great calendar but not much else... EverNote lacks the calendar side of things. Scrybe looks very promising on both fronts but is by invitation only. Oh yeah... it should be free, too :D Am I SOL?

    Read the article

  • wget not converting links

    - by acrosman
    I am trying to mirror a fairly large site (20,000+ pages) prior to a major overhaul. Basically, I need a backup before cutting over to the new one in case we forgot something we need (we'll have about 1,000 pages at launch). The site is run on a CMS that I cannot easily extract usable data from, so I'm trying to make the copy with wget.

    My problem is that wget does not appear to actually be converting links, despite the presence of --convert-links or -k in the command. I've tried a couple of different combinations of flags, but I haven't been able to get the output I need. The most recent failed attempt was:

        nohup wget --mirror -k -l10 -PafscSnapshot --html-extension -R *calendar* -o wget.log http://www.example.org &

    I've also tried including --backup-converted, and --convert-links instead of -k (not that it should have mattered). I've tried it with and without -P and -l; again, not that they should matter. The result is files that still have links like:

        http://www.example.org//ht/d/sp/i/17770
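
    For reference, a sketch of the same invocation with two easy-to-miss details addressed: wget performs link conversion only after the entire retrieval has finished, so a run that is interrupted (or still going) leaves every link absolute, and an unquoted *calendar* may be expanded by the shell before wget ever sees it.

        # Sketch only: quote the reject pattern and let the job run to
        # completion -- link conversion happens only at the very end.
        nohup wget --mirror --convert-links --backup-converted \
            --html-extension -l 10 -P afscSnapshot \
            -R "*calendar*" -o wget.log \
            "http://www.example.org/" &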

    Read the article

  • What WYSIWYG software can I use to create a web page?

    - by Roman
    I have always made web pages by writing HTML code, but now I would like to try a WYSIWYG approach. Can anybody recommend a program I can use for that? I mean a program in which you can move buttons, tables and pictures with the mouse, change their size and shape with the mouse, and use nice templates for blocks of text, buttons, backgrounds and so on. I am using Windows 7. Maybe I already have something suitable pre-installed?

    Read the article

  • Can't access some websites with any browser

    - by Charles Kingsmill
    I'm running Windows 7 64-bit on a new Samsung laptop, connecting to the internet via an ethernet cable to my university's ISP. Some sites work fine (e.g. google.com) but I can't access others at all (microsoft.com, topshop.com). I can't connect to those sites in safe mode with networking, and ping and tracert both fail. There's no proxy. Other users can connect successfully to these sites using my cable and socket.

    I've tried all of the following with no success:

    - using various browsers (IE9, FF, Chrome)
    - creating a new user
    - updating drivers
    - clearing the DNS cache
    - using OpenDNS and Google's DNS
    - turning off Avast
    - tweaking the MTU
    - running the MS malicious software removal tool
    - running Spybot S&D
    - reviewing the hosts file
    - disabling the IPv6 options
    - repairing / resetting winsock settings
    - disabling advanced javascript options

    I have run out of ideas... can anyone see anything I've missed?
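
    Since the MTU appears in the list above, for reference: a quick way to sanity-check it on Windows 7 is a don't-fragment ping. A sketch only; 1472 bytes of ICMP payload plus 28 bytes of headers corresponds to the standard 1500-byte Ethernet MTU, and microsoft.com is just one of the affected sites.

        REM Sketch only: show the MTU configured on each interface.
        netsh interface ipv4 show subinterfaces

        REM Probe the path with don't-fragment pings. If 1472 fails but a
        REM smaller payload (e.g. 1400) gets through, the path MTU is
        REM smaller than the interface thinks it is.
        ping -f -l 1472 microsoft.com
        ping -f -l 1400 microsoft.com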

    Read the article

  • Block a URL at browser level

    - by Farseeker
    Does anyone have a solution (that doesn't involve editing the hosts file) to block a particular URL in Firefox? The basic back story is that I'm trying to discipline myself. I'm spending FAR too much time over at Server Fault, so I want to genuinely block the site on my work PC: whenever I find myself flicking to it during work time, I shouldn't be able to see it. But I'd like to be able to disable the block during my lunch break, so I only spend 40 minutes a day there rather than 4 hours. That said, I don't want to block it at the router, nor for anyone else.
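
    As an illustration, one browser-level approach that keeps the hosts file out of it is a proxy auto-config (PAC) file, loaded through Firefox's connection settings as an automatic proxy configuration URL. A sketch only; the working hours and the dead-proxy port are arbitrary choices, not anything Firefox requires.

        // block.pac -- sketch only.
        function FindProxyForURL(url, host) {
            // During working hours, send Server Fault to a proxy that does
            // not exist, so the request simply fails; lunch stays open.
            if (dnsDomainIs(host, "serverfault.com") &&
                (timeRange(9, 12) || timeRange(13, 17))) {
                return "PROXY 127.0.0.1:9";
            }
            return "DIRECT";   // everything else goes straight out
        }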

    Read the article

  • XPath automation software

    - by holms
    Too bad this topic was closed, because I have roughly the same question. I want to construct XPaths for common HTML blocks that appear across pages. For example: you give the software two URLs whose pages contain the SAME HTML blocks (divs), but with different content in them. Given two stackoverflow.com URLs, the software could detect that the same div#id is used on both pages and output the XPaths of those blocks. Of course I can find the XPaths myself; as far as I remember, Firebug makes it easy and shows the XPath of every HTML element. But that becomes tedious if you want XPaths for LOTS of elements, which is why I'm looking for software to help with this routine.

    Read the article

  • Command-line HTTP crawler for Windows?

    - by Pekka
    Would somebody have a recommendation for a website crawler that can be invoked and configured from the command line? It would need to run in a Windows environment. Saving the data, following stylesheet links etc. is not an issue. I only need the crawler to start with a page, parse it, and follow all the links on the same domain so that in the end, every page on the site has been requested once. Background: I'm setting up a website that gets uploaded frequently from an office location. It combines data from various sources and has several levels of caching. I don't want the first user who visits the site after a fresh upload to have to wait until each page has been generated and saved in the cache.
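
    For what it's worth, wget (which has native Windows builds) can act as exactly this kind of cache-warming crawler: recursive retrieval with --delete-after requests every page once and throws the local copies away. A sketch, with the site name as a placeholder:

        REM Sketch only: crawl the whole site once and discard the downloads,
        REM so every page gets generated and lands in the server-side cache.
        wget --recursive --level=inf --delete-after --no-directories ^
             -o warmup.log http://www.example.com/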

    Read the article

  • block certain websites from browser

    - by phunehehe
    Hello there, a friend of mine (who is not a geek) asked me how to stop her little brother from playing web games on her computer. She is currently using Chrome and IE, and I have never done this before, even on Firefox. I would prefer a solution that is simple and does not require additional applications. Although it seems unlikely, is there a solution that works for all browsers (i.e. do it once and never have to fix it again for a new browser)? Thanks.
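
    For illustration, the closest thing to a do-it-once, works-in-every-browser block without extra software is the Windows hosts file (C:\Windows\System32\drivers\etc\hosts, edited as administrator). A sketch, with placeholder game sites:

        # Sketch only: map each unwanted site to the local machine so that
        # no browser on this computer can reach it. One hostname per line;
        # www and bare-domain variants each need their own entry.
        127.0.0.1    www.somegamesite.example
        127.0.0.1    somegamesite.example
        127.0.0.1    games.anothersite.example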

    Read the article

  • My DNS cannot resolve a website address

    - by ipkiss
    Hello all, recently I have not been able to access the webpage bbc.co.uk any more, while I can access other websites smoothly. At first, I thought there might be some problem with my laptop. However, if I use my laptop through my company network, I can load bbc.co.uk normally. Then I thought maybe my ADSL at home blocks that web address. However, I tried another laptop on my home ADSL and it can load bbc.co.uk very fast. Now I do not know what the problem could be. Can anyone tell me, please? Thank you.
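
    For reference, one quick way to narrow this down is to compare what the laptop's configured resolver returns with what a public resolver returns. A sketch only: nslookup is available on Windows, Mac and Linux, 8.8.8.8 is Google's public DNS, and the flushdns line is Windows-specific.

        REM Ask whatever resolver the home ADSL connection hands out:
        nslookup bbc.co.uk

        REM Ask a public resolver directly; if this answers but the first
        REM query does not, the ISP/router resolver is the likely culprit:
        nslookup bbc.co.uk 8.8.8.8

        REM Windows only: clear any stale or poisoned local DNS cache.
        ipconfig /flushdns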

    Read the article

  • Client certificate based encryption

    - by Timo Willemsen
    I have a question about the security of a file on a webserver. I have a file on my webserver which is used by my web application: a bitcoin wallet. Essentially it's a file with a private key in it, used to decrypt messages. My web application uses the file because it's needed to receive transactions made through the bitcoin network.

    I was looking into ways to secure it. Obviously, if someone has root access to the server, he can do the same as my application. However, I need to find a way to encrypt it. I was thinking of something like this, but I have no clue if it is actually going to work:

    - The client logs in with some sort of client certificate.
    - The web application creates a wallet file.
    - The web application encrypts the file with the client certificate.
    - If the application wants to access the file, it has to use the client certificate.

    So basically, if someone gets root access to the site, they cannot access the wallet. Is this possible, and does anyone know of an implementation of this? Are there any problems with it? And how safe would it be?
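
    As a rough sketch of the encrypt-to-a-certificate step (assuming the client's certificate and private key exist as PEM files named client.crt and client.key, both hypothetical names), OpenSSL's S/MIME tooling can do it from the command line:

        # Sketch only: encrypt the wallet so that only the holder of
        # client.key can recover it. Root can still read wallet.enc, but
        # cannot decrypt it without the client's key.
        openssl smime -encrypt -binary -aes256 -in wallet.dat -out wallet.enc client.crt

        # Decryption needs the client's private key, e.g. supplied with
        # each request rather than stored on the server:
        openssl smime -decrypt -in wallet.enc -out wallet.dat -recip client.crt -inkey client.key

    The catch the question already hints at remains: whenever the application itself has to use the wallet, the decrypted material (or the key) must be present on the server, and root can see it at that moment.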

    Read the article

  • Multiple personalities for a blog

    - by Ralph Rickenbach
    I am using Blogger.com and run multiple websites. I would like to display the content of one single blog on all sites, using URLs like blog.sitename.xxx and each site's corporate identity. The sites are rather different; as an absolute minimum, a solution that accepts site-specific CSS would suffice, but different templates per site would be better. Any solution?

    Read the article

  • public family tree

    - by Remus Rigo
    Hi all, does anyone know an ancestry site that allows you to create a public profile or tree, so that other visitors can see your family tree? On all the sites I have found (dynastree.com, familylink.com, ancestry.com, genebase.com), anyone who wants to see your family tree must be a member or register. Thanks.

    Read the article

  • Caching Reverse-Proxy ISP Host for a Low-Bandwidth Server

    - by Casey
    I am building a webcam with an HTTP server that will be running on a low-bandwidth connection. The content on the site will change every 5 to 10 minutes. Instead of serving files directly from this connection, are there hosting companies that can act as a reverse proxy for my site? That way, if nobody is using the site, the local internet connection remains idle, and if I receive 1000 hits all at the same time, only one HTTP GET is required and the hosting company (on a fat pipe) serves the other 999 requests. This doesn't sound like a very common usage model, but I feel like it would be the optimal solution to my situation.
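
    This is a fairly standard pattern for a caching reverse proxy. As an illustration, nginx's proxy cache with proxy_cache_lock collapses simultaneous cache misses into a single upstream request, which is exactly the 1000-hits-one-GET behaviour described. A sketch only: hostnames and timings are placeholders, and the proxy would run at the hosting provider, not on the home connection.

        # Sketch: nginx as a caching reverse proxy in front of the webcam box.
        # (Both directives below live inside the http {} block.)
        proxy_cache_path /var/cache/nginx/webcam keys_zone=webcam:10m max_size=100m;

        server {
            listen 80;
            server_name webcam.example.com;                   # placeholder name

            location / {
                proxy_pass            http://home-webcam.example.net;  # low-bandwidth origin
                proxy_cache           webcam;
                proxy_cache_valid     200 5m;    # content changes every 5-10 minutes anyway
                proxy_cache_lock      on;        # 1000 simultaneous misses -> one upstream GET
                proxy_cache_use_stale updating error timeout;  # serve the old frame meanwhile
            }
        }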

    Read the article

  • DNS configuration for external/internal resolution

    - by FerranB
    Hi, we have an internal web server which is reachable both from the Internet and from the local network. The server sits on the local network. The current configuration is the following:

    - To access it through the Internet you use http://webexample.com
    - To access it through the local network you use http://myweb

    The main problem is that local users cannot share links with external users, which is a problem for us. I want to set up the following configuration:

    - All users (local and Internet) access the site through http://webexample.com
    - The local DNS server resolves webexample.com to the local network IP (i.e. 192.168.2.100)

    Any other suggestions? What is the best way to override the resolution of webexample.com on Windows Server? Can it be done on the DNS server, or does it have to be done in the hosts file?
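
    For illustration, the usual answer is split-horizon DNS: the internal DNS server hosts its own copy of the public zone and hands out the private address, while external clients keep resolving the public one. A sketch using dnscmd on a Windows DNS server, with the zone name and IP taken from the question; /DsPrimary assumes an AD-integrated zone (a plain /Primary zone with a zone file is the alternative).

        REM Sketch only: create an internal copy of the public zone and point
        REM it at the web server's LAN address.
        dnscmd /ZoneAdd webexample.com /DsPrimary
        dnscmd /RecordAdd webexample.com @   A 192.168.2.100
        dnscmd /RecordAdd webexample.com www A 192.168.2.100

        REM Caveat: once this server is authoritative for webexample.com,
        REM every public record in that zone (mail, other subdomains, ...)
        REM must be recreated here as well, or internal users lose them.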

    Read the article
