Search Results

Search found 13705 results on 549 pages for 'browser'.


  • Internet Explorer 9 send page by email grayed out

    - by catester
    Internet Explorer 9 in Windows 7 Home Premium 64-bit has the items Send Page by email and Send link by email grayed out. The default email client is Windows Live Mail 2011. I have used the "Set Default" dialogs to do that. Everything else recognizes Windows Live Mail 2011 as the default, including Firefox 4. I have duplicated this problem on two computers here. I have changed the default email program to Mozilla Thunderbird on one of the computers, and IE9 won't let me Send Page by email or Send link by email there, either. Has anyone gotten this to work? Firefox is my default browser, but sometimes I have to use IE9.

    Read the article

  • Why does Windows Media Center try to open zip files?

    - by gpryatel
    Notes: the OS is Windows 7 and the browser is the latest Firefox. After saving a zip file to the desktop, Windows Media Center opens up. I looked around its config settings but could not find anything related to zip files. How do I turn that off? Also, I don't know if this should be a separate question or not: unless I right-click and choose Save Link As... for zip files, I don't get a Firefox dialog asking what to do with the file (Open/Save). The files get saved to some place like c:\users\namegoeshere\appdata. This only happens on the Windows 7 computer. I looked around in Firefox's settings for saving files, and I do have 'ask me where to download...' enabled. I can get more exact path names when I get home.

    Read the article

  • Life Cycle Navigator?

    - by C.W.Holeman II
    In many environments the file system directory structure and naming conventions attempt to allow one to use a file manager to navigate the life cycle of a document. This overloading of functions makes it difficult for users to handle the complexity. A file browser is a tool that lets the user navigate among files located in a directory structure to find a specific file. A life cycle navigator, by contrast, is a tool that, given a specific file, lets the user navigate its life cycle from source to published copy and across versions. Does a Life Cycle Navigator exist? I see a user pointing at an object: the left mouse button displays the document, and the right mouse button opens a Life Cycle Navigator (LCN). The LCN displays a tree for a specific document within a file manager, for example:
      Published
        3.2 Current
        3.1
        3.0
        +2.x
        +1.x
        +Archived
        +All
      Source
        Draft 3.2 Current
        3.1
        3.0
        +2.x
        +1.x
        +Archived
        +All
      +Work Flow
      +Properties
    Or from a command line:
      $ lcn x.pdf --open_source_document | my_favorite_editor
      $ lcn x.pdf --show_published_version_info
      $ lcn x.pdf --show_previous_publish_versions_info
    See also, Life Cycle Navigator.

    Read the article

  • Installing Bugzilla on Ubuntu 9.04 and Plesk

    - by makeflo
    Hey guys. I'm trying to install the latest Bugzilla version on my Ubuntu server (I want to use a subdomain like bugs.domain.com). I already installed all the necessary Perl modules and check_modules.pl doesn't show any errors. But when I run the testserver.pl script I get the following:
      TEST-OK Webserver is running under group id in $webservergroup
      TEST-FAILED Fetch of images/padlock.png failed
    I'm also not able to visit ANY file within the bugzilla folder from the browser; I always get a 404 error. The bugzilla folder and all the files it contains are owned by apache. I tried adding the Apache configuration from the installation guide to the http.include file of the domain, and to the vhosts.conf file of the subdomain as well. I don't know what to do... Playing with Plesk's suexecgroup doesn't bring any solution... I hope you can help me! Thanks in advance!
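
    Not part of the question, but for reference: a minimal sketch of the Apache directory block that Bugzilla's installation guide describes, assuming an Apache 2.2 vhost; the path is a placeholder and should point at wherever the bugzilla folder actually lives.

      # Hypothetical path - adjust to the real location of the bugzilla folder
      <Directory /var/www/vhosts/domain.com/subdomains/bugs/httpdocs>
          AddHandler cgi-script .cgi
          Options +ExecCGI
          DirectoryIndex index.cgi index.html
          AllowOverride Limit FileInfo Indexes Options
          Order allow,deny
          Allow from all
      </Directory>

    The AllowOverride line is what lets Bugzilla's own .htaccess files take effect; if the block is added to a file the Plesk-generated vhost never includes, it simply has no effect.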

    Read the article

  • Trouble with mod_proxy and mongrel_rails

    - by x3ro
    Hey there. I'm trying to set up a mod_proxy + Mongrel combination, but somehow Apache/mod_proxy is unable to access Mongrel locally. The following is my configuration for mod_proxy:
      ProxyRequests Off
      ProxyPreserveHost On
      <Location />
          ProxyPass http://localhost:3000/
          ProxyPassReverse http://localhost:3000/
          Order deny,allow
          Allow from all
      </Location>
    Mongrel/Rails is running just fine, because I can access it from my browser, and even with lynx on the server. However, I get the following error when trying to use the proxy:
      [error] [client 127.0.0.1] Invalid Content-Length
    I would appreciate any help :D PS: Oh, and the server is running Plesk to configure vhosts, if that's important.

    Read the article

  • How to create a simple radio station [on hold]

    - by John
    I've been digging around for ages but not getting very far, so any links or tips would be massively appreciated. I want to create a central "radio station" in my home that streams one playlist to any computer on my internal network pointing its browser at the server's IP. I have an old Mac mini running Ubuntu as a slave machine, and I was originally thinking I could get PHP and Apache to handle this, but then quickly realised that PHP will of course serve each connection its own stream independently, i.e. no shared radio station. Are there any servers already built for this sort of behaviour, or is SHOUTcast one of the only options? Thanks, John
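
    Not from the question, just a sketch of one common approach: MPD (Music Player Daemon) has a built-in HTTP streaming output, so a single shared playlist on the Mac mini can be tuned into from any browser or media player on the LAN. The values below are illustrative, not a tested configuration:

      # /etc/mpd.conf (excerpt) - one daemon plays the playlist,
      # every listener hears the same stream
      audio_output {
          type        "httpd"
          name        "Home Radio"
          # "lame" = MP3 output; "vorbis" also works
          encoder     "lame"
          port        "8000"
          bitrate     "128"
          format      "44100:16:2"
          # 0 = no listener limit
          max_clients "0"
      }

    Listeners would then open http://<mac-mini-ip>:8000 (address is a placeholder); Icecast plus a source client such as ices or ezstream is the other common route.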

    Read the article

  • Windows share mounted then symlinked on LAMP server. Serves up html, but not images.

    - by Samuurai
    This has really got me befuddled... I've mounted a share in /etc/fstab, like this:
      //srv1/UserUploads /mount/UserUploads cifs rw,user,exec,uid=wwwrun,gid=www,username=shareuser,password=sharepw 0 0
    I then have a symlink here:
      WEBSVR:/Web/htdocs/public_html # ls -l useruploads
      lrwxrwxrwx 1 wwwrun www 18 Dec  7 09:18 useruploads -> /mount/UserUploads
    Oddly, if I ls inside the mounted area, items appear with a capital S in the permissions:
      -rwxrwSrwx 1 wwwrun www 4077 Dec 30 14:54 prop9.jpg
      -rwxrwSrwx 1 wwwrun www    4 Jan 12 15:57 test.html
    If I bring up test.html in a browser, it works fine, but if I go to prop9.jpg, Chrome gives me this error:
      This web page is not available. The web page at http://10.1.64.100/useruploads/webteam/help2let/prop6-1.jpg might be temporarily down or it may have moved permanently to a new web address. Error 100 (net::ERR_CONNECTION_CLOSED): Unknown error.
    Has anyone seen this behaviour where binary files (images) aren't displayed, but html/text is?
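
    One thing worth checking (an assumption on my part, not something stated in the question): Apache's use of sendfile and mmap on CIFS-mounted content is a known cause of exactly this symptom, where small text files come through but larger binary files such as images have their connections dropped. A hypothetical snippet to rule that out:

      # Hypothetical addition to the vhost serving the symlinked directory:
      # sendfile/mmap on CIFS mounts is a known cause of binary files failing
      # while small text files still work.
      <Directory /mount/UserUploads>
          EnableSendfile Off
          EnableMMAP Off
      </Directory>

    Both directives are stock Apache 2.x.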

    Read the article

  • In 2011, what are the reasons to stick with plain text mails?

    - by Aaron Digulla
    People entering college today have never known a world without the Internet. HTML dates back to around 1990, more than twenty years ago. But plain text mails are still common despite all their problems:
      Encoding issues
      Wrapped code segments
      No links
      No way to use the "a picture says more than a thousand words" lore
    Most of the security risks are now handled by the underlying browser engine and smart settings like:
      Don't allow JavaScript in mails
      Don't execute attachments
      Don't download external resources (like web bugs)
    On top of that, only very few people still read mail exclusively in command-line tools like Mutt. Knowing Mutt myself, I'm pretty sure you can configure it to display HTML mail with, say, w3m. Also, most HTML-capable mail clients send two versions of the mail (a plain text part plus an HTML part). I'm not sure there is anyone left on the planet who still uses a 56kbit modem to access their mail account. So what reasons are left to stick with plain text mails in 2011?

    Read the article

  • Does there exist video chat software which works over a LAN between different types of devices?

    - by Graphics Noob
    What I'm trying to do is set up a local area network, without Internet access, which allows the users to video chat with each other. The connected devices will include Linux and Android devices, so software which runs on just those two types of systems will work, although running through a browser would be optimal. The most promising lead I've found so far is Camfrog, which has a video-chat app for Android and a video chat server for Linux. The problem is that the documentation for the server is non-existent, and I don't know whether the Android app can connect directly to the video chat server over a LAN or whether it can only connect to Camfrog's own server over the Internet.

    Read the article

  • How can I share data from a Samsung Wave Mobile phone with the Mac OS?

    - by M. Bedi
    This is driving me up the wall just a bit. I have a new Samsung Wave mobile phone. It is running the Bada OS. The mobile phone does not come with any software for the Mac. The Kies desktop interface is only available for Windows. I did try installing the Kies software in a VM with Parallels 5, but it did not detect the phone connected via USB. I tried using Bluetooth File Exchange on the Mac; it lets me browse the file system on the Wave phone but not actually see any of the media files; I just get empty directory views. But I am able to access files using media sharing between the Wave phone and my PS3. So what would be a Mac desktop app that can be used as a media-sharing browser?

    Read the article

  • How Can I Edit Google Chrome's 'History Provider Cache' File?

    - by dissolved
    I'm interested in editing (not completely deleting) the contents of some of Google Chrome's cache files, in particular the 'History Provider Cache' (found in ~/Library/Application Support/Google/Chrome/Default on Mac). As this other question suggests, it appears to simply be a SQLite file. Unfortunately, when I try to open it using a SQLite browser (MesaSQLite) I'm asked for an encryption key. So, I'd welcome any suggestions on how to either (1) determine the encryption key, or (2) find an alternate way to edit this file. The end goal is to be able to remove specific annoying suggestions from the Omnibox. I've read countless other techniques, but none seem to remove suggestions that have the clock icon next to them. Some say deleting this file entirely will do the trick (and I imagine it will), but I don't wish to trash my entire browsing history. I find most of the suggestions useful and helpful, and I'd like to preserve that.

    Read the article

  • Distribute Nagios to reduce false alarms

    - by GDR
    I'm currently running a single Nagios instance. From time to time, I'm getting false alarms about timeouts - for example, it says that HTTP is down on some server, but when I open it in my browser several seconds later, it loads fast, and in general there is no trace of an error. What can I do to reduce such false alarms? I'm guessing that it's because of transient network issues on my monitoring server. I guess that setting up another monitoring server on a different network would greatly help, but how do I plug it into Nagios? Is it at all possible with Nagios or do I have to switch to another monitoring system? I like my configs and, if possible, I'd like to stay with Nagios or something compatible (Icinga?)
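
    Not from the question, but one mitigation that works even with a single Nagios instance is to require several consecutive failed checks before an alert goes out, so a one-off timeout never notifies. A hypothetical service definition (host name, template and timings are placeholders):

      # Hypothetical service tweak - not a distributed setup, just retry tuning
      define service {
          use                   generic-service
          host_name             web01
          service_description   HTTP
          check_command         check_http
          check_interval        5    ; minutes between normal checks
          retry_interval        1    ; minutes between re-checks after a failure
          max_check_attempts    4    ; soft failures required before a HARD state/alert
      }

    A second Nagios instance on another network can also feed results back passively (for example via NSCA) as a distributed setup, but retry tuning alone usually removes most transient timeouts.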

    Read the article

  • New Secure Website with Apache Reverse Proxy

    - by jtnire
    I wish to set up a new website that will be accessed by users over HTTPS. I think it is good practice to put the "real" web server in a separate subnet and then install an Apache reverse proxy in a DMZ. My question is, where should I put the SSL cert(s)? Should I:
      a) use a self-signed cert on the "real" web server, and a proper cert on the reverse proxy?
      b) use two real certs, one on the "real" web server and one on the reverse proxy?
      c) use no cert on the "real" web server, and a proper cert on the reverse proxy?
    I'd like to use a) or c), if possible. I also don't want anyone's browser complaining about a self-signed cert. Thanks
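
    Not part of the question, but a minimal sketch of what option c) tends to look like on the proxy, assuming Apache with mod_ssl and mod_proxy; the hostname, paths and backend address are placeholders:

      <VirtualHost *:443>
          # TLS terminates here with the real certificate
          ServerName www.example.com
          SSLEngine On
          SSLCertificateFile    /etc/ssl/certs/www.example.com.crt
          SSLCertificateKeyFile /etc/ssl/private/www.example.com.key

          # plain HTTP to the "real" server on the internal subnet
          ProxyPreserveHost On
          ProxyPass        / http://10.0.1.10/
          ProxyPassReverse / http://10.0.1.10/
      </VirtualHost>

    TLS ends at the proxy with the proper cert, so no browser ever sees a self-signed one; the backend speaks plain HTTP on the trusted subnet only.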

    Read the article

  • Safari, IIS and optional Client Certificates

    - by Philipp
    I have an ASP.NET web app running on IIS 7.5. The web server is configured to accept client certificates. Unfortunately, visitors using the Safari browser are unable to view the page. It is the same problem as described under the following link: http://www.mnxsolutions.com/apache/safari-providing-an-ssl-error-client-certificate-rejected%E2%80%9D-when-other-browsers-work.html Does anyone know how to solve this? I'd really appreciate your help. Edit: this seems to be the same problem: http://superuser.com/questions/231695/iis7-5-ssl-question-safari-users-get-a-prompt-of-certificate-to-select

    Read the article

  • Polling performance on shared host

    - by Azincourt
    I am planning on writing a small browser game. The web server is a shared server, with no root access and no installs possible. I want to use AJAX for client/server communication. There will be 12 players, and each player would poll the server for the current game status every X milliseconds (let's say 200 ms). Polling every 200 ms is 5 requests per second per player, so 12 players x 5 = 60 requests per second. Can Apache handle those requests? What might be the bottlenecks with this approach?

    Read the article

  • VLC - Play two mp3s simultaneously from command line

    - by raoulcousins
    I'm using VLC 2.0.8 on Windows 7. How do I play two mp3s simultaneously from the command line (command line because I want to write a batch script that launches the mp3s)? I've tried vlc 1.mp3 2.mp3 and vlc 1.mp3 --input-slave 2.mp3 (I've seen the second one suggested as a way to play a video file and a separate audio file simultaneously). Both of these just launch 1.mp3. Not important, but if you're wondering, the mp3s are cafe sounds and rain sounds respectively, so I can play sounds similar to those found at http://rainycafe.com/ without having to launch a browser.
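
    Not from the question, just a sketch of one approach: start two separate VLC processes from the batch script, one per file. The install path below is the usual default but may differ on your machine, and the file names are placeholders:

      rem two_sounds.bat (hypothetical) - each "start" launches its own VLC instance;
      rem --no-one-instance stops the second launch being handed to the first window
      start "" "C:\Program Files\VideoLAN\VLC\vlc.exe" --no-one-instance --loop cafe.mp3
      start "" "C:\Program Files\VideoLAN\VLC\vlc.exe" --no-one-instance --loop rain.mp3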

    Read the article

  • FeedValidator & FeedBurner get a 404 when accessing WordPress RSS feeds with permalinks enabled

    - by Wazbaur
    I'm helping a friend set up a self-hosted WordPress blog + FeedBurner and I'm seeing a problem with the feeds that I find somewhat mysterious. Using the default permalink structure (e.g., ?p=123) everything works as expected; I can follow the feed in Google Reader, navigate to it manually, and set it up in FeedBurner. However, once I switch away from the default permalink structure, FeedBurner and FeedValidator both report that accessing the feed returns an HTTP 404, and Google Reader no longer shows new posts (I'm assuming for the same reason), but I can navigate to the feed in a browser. When I do that, nothing appears to be wrong; there is a feed there and it contains all the posts I expect it to have. I've restarted the FeedBurner & Reader set-up from the beginning after changing the link structure, so I don't think they're doing anything silly like looking at the feed at its old address. I've seen people with similar problems in various other places, but there doesn't seem to be a good answer anywhere.
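
    Not from the question, but for reference, since pretty permalinks depend on URL rewriting: the stock rewrite block WordPress writes into .htaccess (assuming Apache with mod_rewrite and AllowOverride permitting it) is the first thing worth confirming when feed URLs 404 for some clients:

      # BEGIN WordPress
      <IfModule mod_rewrite.c>
      RewriteEngine On
      RewriteBase /
      RewriteRule ^index\.php$ - [L]
      RewriteCond %{REQUEST_FILENAME} !-f
      RewriteCond %{REQUEST_FILENAME} !-d
      RewriteRule . /index.php [L]
      </IfModule>
      # END WordPress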

    Read the article

  • Lots of files being used by blank web page. What are they?

    - by byronyasgur
    I am trying to optimise a website and I was using the network waterfall facility in Google Chrome. When I looked at the results there were lots of files which I didn't recognise. I first thought they might be something to do with Google Chrome itself, so I put a blank HTML file on my desktop and checked, but there was nothing in the waterfall except the file itself. So I put a blank file on my server and I got the output below. What are all these files? Are they all necessary? Is this normal, and do I need to be in any way concerned? My hosting provider has always been excellent in every regard that I'm aware of. It is shared hosting, using cPanel, and is based on a LAMP server. I also note that a couple of those files have problems, but I have no idea how to fault-find that or whether it's a concern. EDIT: I have cleared the cache, so I don't think it's a browser cache issue.

    Read the article

  • HTTP redirects showing ip address

    - by DrKarl
    I have a domain name at 1&1 and a VPS on Linode. I noticed that my site was enclosed in a frameset which I didn't create. I checked nginx and Jetty on the VPS, but neither of them created the frameset. Then I checked the domain control panel at 1&1 and saw that the redirection could be either a frame redirect or an HTTP redirect. I changed it to an HTTP redirect and the frameset was gone; everything was fine except that the browser's URL bar now shows the IP address of the server instead of my domain. How can I avoid the frameset and still have the proper URL displayed instead of an IP?

    Read the article

  • Why can't I connect to a wifi network with my laptop, when I can with my phone?

    - by Alex Sf.
    I can connect with my phone and use the browser as usual. On my laptop it won't connect under Windows 7, while under Ubuntu it will connect but with no Internet. What is going on here, and how can I get Internet on my laptop? Edit: it's a public wifi hotspot. I can connect with no issues at home. My network adapter is an Atheros AR5B97, and my phone is an iPhone 3G. The troubleshooting wizard is of no help since it asks me to check the router, which I can't do since it's a public hotspot.

    Read the article

  • Unable to get to remote samba share

    - by tubaguy50035
    I have a remote VPS that I would like to set up Samba on, and only allow my IP access to it. I currently have this in my smb.conf:
      [global]
      netbios name = apollo
      security = user
      encrypt passwords = true
      socket options = TCP_NODELAY
      printing = bsd
      log level = 3
      log file = /var/log/samba/log/%m
      debug timestamp = yes
      max log size = 100

      [hosting]
      path = /hosting/
      comment = Hosting Folder
      browseable = yes
      read only = yes
      guest account = yes
      valid users = nick
    I have the ports (137, 138, 139, 445) open in iptables (they're open to everyone right now while I debug) and I see nothing in the syslog about iptables blocking my requests. When I try to open a file browser to my address \\ipaddress, it hangs for a good thirty seconds and then opens a login box. I enter my user name and password for the server and hit OK. It then opens the same box; I enter my credentials again and hit Enter. Windows then tells me it could not connect. My user account is already added to Samba. Anybody have any suggestions for what I can do to get this working?
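
    The question mentions allowing only one IP; not from the post itself, but the usual smb.conf way to do that is host-based filtering in [global]. The address below is a placeholder:

      # Hypothetical additions to [global] - allow only localhost and one client IP
      # (203.0.113.5 stands in for your own address)
      hosts allow = 127.0.0.1 203.0.113.5
      hosts deny  = 0.0.0.0/0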

    Read the article

  • How do we increase the maximum allowed HTTP GET query length in Jetty?

    - by Mike
    We are using Jetty to run an Apache Solr index. We've had some queries grow way beyond the previously expected maximum length, and we are now having issues where most queries return no data because the URL gets truncated. These requests are not being made through a browser; they're being made programmatically using the Apache_Solr_Service PHP library. The application expects queries to come in as HTTP GET requests, so simply switching to POST will not solve this problem. How can we increase the maximum allowed HTTP GET query length in Jetty? Thanks!
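
    Not part of the question, but for reference: in the Jetty 6 that older Solr releases bundle, the limit lives on the connector's header buffer, set in example/etc/jetty.xml. The port and size below are illustrative:

      <!-- Hypothetical excerpt from example/etc/jetty.xml (Jetty 6, as bundled with
           older Solr releases); headerBufferSize bounds the request line + headers.
           Later Jetty versions call the same knob requestHeaderSize. -->
      <Call name="addConnector">
        <Arg>
          <New class="org.mortbay.jetty.bio.SocketConnector">
            <Set name="port">8983</Set>
            <Set name="headerBufferSize">65536</Set>
          </New>
        </Arg>
      </Call>

    The request line (method, URI and query string) has to fit inside that buffer, which is what caps very long GET queries at the default size.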

    Read the article

  • Laptop loses signal from WiFi router, but mobile phone holds it fine

    - by Anton
    Hi, I have an extremely weird issue with my WiFi router. Both Ubuntu & Windows 7 can connect to it fine, but after 5-10 minutes the browser (any of them) stops opening pages and tells me it cannot resolve the host address. At the same time, tools like Skype or BitTorrent keep working without any issues, and I can also browse the Internet on my mobile phone connected to the very same router. Resetting the router helps, but after 5-10 minutes I see the same problem again... Ubuntu tells me the WiFi signal is lost (the mobile still sees it); Windows 7 just won't let me browse anywhere. Can anyone give me a suggestion on this, please? Thanks

    Read the article

  • What are possible causes of keyboard lag on my desktop machine?

    - by Jer
    I am running Windows 7 and began experiencing keyboard lag in most applications, and it seems to be getting worse. Certain websites are the worst - on some, I can type a sentence, take my hands off the keyboard, and watch the characters continue to appear on the screen for several seconds. Others are not as bad, but it is still noticeable and annoying. I just started noticing it in non-browser applications (e.g. Outlook) as well. I've disabled all extensions in Firefox and rebooted my machine, and that did nothing. Nothing is using much memory or many CPU cycles, even when the lag is occurring. This is a machine at work with very strict controls over what can be installed, so the chances of any kind of malware are very slim. I don't believe anything has been installed since before the problem started. What could be causing this, and/or what can I do to debug it?

    Read the article

  • Sniff packets using tcpdump

    - by denisk
    I have a completely noob question. I want to see all packets that come to my computer from a particular site (google.com). So I start tcpdump with
      sudo tcpdump -i eth0 host google.com
    then enter google.com in a browser and hit Enter - nothing gets captured. I can't figure out why this happens. What am I doing wrong? Edit: it turned out I was listening on the wrong interface. I changed eth0 to any and it worked; it was ppp1 that needed to be listened on. Thanks for your answers!

    Read the article
