Search Results

Search found 68566 results on 2743 pages for 'rich internet application'.


  • Internet Explorer - selected language is changing to English when opening a new window

    - by Amit
    When opening a new window in IE8 or IE9 (it doesn't matter whether I use a link or window.open), my selected keyboard language changes to English (regardless of the previous selection; I tried it with a few different languages). This doesn't happen for me in Chrome or Firefox (all the browsers are installed in their English versions), and I tested it on Windows 7 and Windows 2008 R2. Is there any way to avoid this? If there isn't - supposing the new window is within my website or application - is there a way to change it back?

    Read the article

  • .NET HTML Sanitation for rich HTML Input

    - by Rick Strahl
    Recently I was working on updating a legacy application to MVC 4 that included free-form text input. When I set up the new site, my initial approach was to not allow any rich HTML input - only simple text formatting that would respect a few simple HTML commands for bold, lists etc. and automatically handle line-break processing for new lines and paragraphs. This is typical for what I do with most multi-line text input in my apps, and it works very well with very little development effort involved.

    Then the client sprung another note: oh, by the way, we have a bunch of customers (real estate agents) who need to post complete HTML documents. Uh oh! There goes the simple theory. After some discussion and pleading on my part (<snicker>) to try and avoid this type of raw HTML input because of potential XSS issues, the client decided to go ahead and allow raw HTML input anyway.

    There have been lots of discussions on this subject on StackOverflow (and here and here), but after reading through some of the solutions I didn't really find anything that would work even closely for what I needed. Specifically, we need to be able to allow just about any HTML markup, with the exception of script code. Remote CSS and images need to be loaded, links need to work, and so on. While the 'legit' HTML posted by these agents is basic in nature, it does span most of the full gamut of HTML (4). Most of the XSS prevention/sanitizer solutions I found were way too aggressive and rendered the posted output unusable, mostly because they tend to strip any externally loaded content. In short, I needed a custom solution.

    I thought the best solution would be to use an HTML parser - in this case the Html Agility Pack - and then run through all the HTML markup provided and remove any of the blacklisted tags, plus a number of attributes that are prone to JavaScript injection. There's much discussion on whether to use blacklists vs. whitelists in the discussions mentioned above, but I found that whitelists make sense in simple scenarios where you might allow manual HTML input; when you need to allow a larger array of HTML functionality, a blacklist is probably easier to manage, as the vast majority of elements and attributes can be allowed. Also, whitelisting gets a bit more complex with HTML5 and the proliferation of new HTML tags, and most new tags generally don't affect XSS issues directly. Pure whitelisting based on elements and attributes also doesn't capture many edge cases (see some of the XSS cheat sheets listed below), so even with a whitelist, custom logic is still required to handle many of those edge cases.

    The Microsoft Web Protection Library (AntiXSS)

    My first thought was to check out the Microsoft AntiXSS library. Microsoft has an HTML encoding and sanitation library in the Microsoft Web Protection Library (formerly the AntiXSS Library) on CodePlex, which provides stricter functions for whitelist encoding and sanitation. Initially I thought the Sanitizer class and its static members would do the trick for me, but I found that this library is way too restrictive for my needs. Specifically, the Sanitizer class strips out images and links, which rendered the full HTML from our real estate clients completely useless. I didn't spend much time with it, but apparently I'm not alone in feeling this library is not really useful without some way to configure its operation.
    To give you an example of what didn't work for me with the library, here's a small and simple HTML fragment that includes script, img and anchor tags. I would expect the script to be stripped and everything else left intact. Here's the original HTML:

        var value = "<b>Here</b> <script>alert('hello')</script> we go. Visit the " +
                    "<a href='http://west-wind.com'>West Wind</a> site. " +
                    "<img src='http://west-wind.com/images/new.gif' /> ";

    and the code to sanitize it with the AntiXSS Sanitizer class:

        @Html.Raw(Microsoft.Security.Application.Sanitizer.GetSafeHtmlFragment(value))

    This produced a not-so-useful sanitized string:

        Here we go. Visit the <a>West Wind</a> site.

    While it removed the <script> tag (good), it also removed the href from the link and the image tag altogether (bad). In some situations this might be useful, but for most tasks I doubt this is the desired behavior. While links can contain javascript: references and images can 'broadcast' information to a server, without configuration to tell the library what to restrict this becomes useless to me. I couldn't find any way to customize the whitelist, nor is there code available in this 'open source' library on CodePlex.

    Using Html Agility Pack for HTML Parsing

    The WPL library wasn't going to cut it. After doing a bit of research, I decided the best approach for a custom solution would be to use an HTML parser and inspect the HTML fragment/document I'm trying to import. I've used the Html Agility Pack before for a number of apps where I needed an HTML parser without requiring an instance of a full browser like the Internet Explorer Application object, which is inadequate in Web apps. In case you haven't checked out the Html Agility Pack before, it's a powerful HTML parser library that you can use from your .NET code. It provides a simple, parsable HTML DOM model for full HTML documents or HTML fragments that lets you walk through each of the elements in your document. If you've used the HTML or XML DOM in a browser before, you'll feel right at home with the Agility Pack.

    Blacklist-based HTML Parsing to Strip XSS Code

    For my purposes of HTML sanitation, the process is to walk the HTML document one element at a time and check each element and attribute against a blacklist. There's quite a bit of argument about what's better: a whitelist of allowed items or a blacklist of denied items. While whitelists tend to be more secure, they also require a lot more configuration, and in the case of HTML5 a whitelist could be very extensive. For what I need, I only want to ensure that no JavaScript is executed, so the blacklist includes the obvious <script> tag plus any tag that allows loading of external content, including <iframe>, <object>, <embed> and <link> etc. <form> is also excluded, to avoid posting content to a different location. I also disallow <head> and <meta> tags in particular for my case, since I'm only allowing posting of HTML fragments. There is also some internal logic to exclude attributes that include references to JavaScript or CSS expressions. The default tag blacklist reflects my use case, but it is customizable and can be added to.
    Here's my HtmlSanitizer implementation:

        using System.Collections.Generic;
        using System.IO;
        using System.Xml;
        using HtmlAgilityPack;

        namespace Westwind.Web.Utilities
        {
            public class HtmlSanitizer
            {
                public HashSet<string> BlackList = new HashSet<string>()
                {
                    { "script" }, { "iframe" }, { "form" }, { "object" },
                    { "embed" }, { "link" }, { "head" }, { "meta" }
                };

                /// <summary>
                /// Cleans up an HTML string and removes HTML tags in the blacklist.
                /// </summary>
                public static string SanitizeHtml(string html, params string[] blackList)
                {
                    var sanitizer = new HtmlSanitizer();
                    if (blackList != null && blackList.Length > 0)
                    {
                        sanitizer.BlackList.Clear();
                        foreach (string item in blackList)
                            sanitizer.BlackList.Add(item);
                    }
                    return sanitizer.Sanitize(html);
                }

                /// <summary>
                /// Cleans up an HTML string by removing elements on the
                /// blacklist and all attributes that start with onXXX.
                /// </summary>
                public string Sanitize(string html)
                {
                    var doc = new HtmlDocument();
                    doc.LoadHtml(html);
                    SanitizeHtmlNode(doc.DocumentNode);

                    string output = null;

                    // Use an XmlTextWriter to create self-closing tags
                    using (StringWriter sw = new StringWriter())
                    {
                        XmlWriter writer = new XmlTextWriter(sw);
                        doc.DocumentNode.WriteTo(writer);
                        output = sw.ToString();

                        // strip off the XML doc header
                        if (!string.IsNullOrEmpty(output))
                        {
                            int at = output.IndexOf("?>");
                            output = output.Substring(at + 2);
                        }
                        writer.Close();
                    }

                    return output;
                }

                private void SanitizeHtmlNode(HtmlNode node)
                {
                    if (node.NodeType == HtmlNodeType.Element)
                    {
                        // check for blacklist items and remove
                        if (BlackList.Contains(node.Name))
                        {
                            node.Remove();
                            return;
                        }

                        // remove CSS expressions and embedded script links
                        if (node.Name == "style")
                        {
                            if (string.IsNullOrEmpty(node.InnerText))
                            {
                                if (node.InnerHtml.Contains("expression") ||
                                    node.InnerHtml.Contains("javascript:"))
                                    node.ParentNode.RemoveChild(node);
                            }
                        }

                        // remove script attributes
                        if (node.HasAttributes)
                        {
                            for (int i = node.Attributes.Count - 1; i >= 0; i--)
                            {
                                HtmlAttribute currentAttribute = node.Attributes[i];

                                var attr = currentAttribute.Name.ToLower();
                                var val = currentAttribute.Value.ToLower();

                                // remove event handlers
                                if (attr.StartsWith("on"))
                                    node.Attributes.Remove(currentAttribute);

                                // remove script links from any attribute
                                else if (
                                    //(attr == "href" || attr == "src" || attr == "dynsrc" || attr == "lowsrc") &&
                                    val != null &&
                                    val.Contains("javascript:"))
                                    node.Attributes.Remove(currentAttribute);

                                // remove CSS expressions
                                else if (attr == "style" &&
                                         val != null &&
                                         (val.Contains("expression") ||
                                          val.Contains("javascript:") ||
                                          val.Contains("vbscript:")))
                                    node.Attributes.Remove(currentAttribute);
                            }
                        }
                    }

                    // look through child nodes recursively
                    if (node.HasChildNodes)
                    {
                        for (int i = node.ChildNodes.Count - 1; i >= 0; i--)
                        {
                            SanitizeHtmlNode(node.ChildNodes[i]);
                        }
                    }
                }
            }
        }

    Please note: use this as a starting point only for your own parsing, and review the code for your specific use case! If your needs are less lenient than mine were, you can make this much stricter by not allowing src and href attributes or CSS links if your HTML doesn't allow them. You can also check links for external URLs and disallow those - lots of options. The code is simple enough to make it easy to extend to fit your use cases more specifically. It's also quite easy to make this code work using a whitelist approach if you want to go that route.
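    For reference, calling the class through the static helper looks like this; userHtml stands in for whatever raw input the form posted, and the second call shows swapping in a custom blacklist via the params argument:

        // default blacklist
        string clean = HtmlSanitizer.SanitizeHtml(userHtml);

        // custom blacklist that also strips <style> blocks entirely
        string stricter = HtmlSanitizer.SanitizeHtml(userHtml,
            "script", "iframe", "form", "object", "embed",
            "link", "head", "meta", "style");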
    The code above is semi-generic, allowing full-featured HTML fragments while disallowing only script-related content. The Sanitize method walks each node of the document and recursively drills into all of its children until the entire document has been traversed. Note that the code uses an XmlTextWriter to write the output - this is done to preserve XHTML-style self-closing tags, which are otherwise left as non-self-closing tags.

    The sanitizer code scans for blacklisted elements and removes those that are not allowed. Note that the blacklist is configurable, either via the property on the instance class or via the string parameter list of the static method. Additionally, the code goes through each element's attributes and applies a host of rules gleaned from some of the XSS cheat sheets listed at the end of the post. Clearly there are a lot more XSS vulnerabilities, but a lot of them apply to ancient browsers (IE6 and old versions of Netscape) - many of these glaring holes (like CSS expressions - WTF, IE?) have been removed in modern browsers.

    What a Pain

    To be honest, this is NOT a piece of code that I wanted to write. I think building anything related to XSS is better left to people who have far more knowledge of the topic than I do. Unfortunately, I was unable to find a tool that worked even closely for me, or even provided a working base. For the project I was working on I had no choice, and I'm sharing the code here merely as a baseline to start with and potentially expand on for specific needs. It's sad that the Microsoft Web Protection Library is currently such a train wreck - this is really something that should come from Microsoft as the systems vendor, or possibly from a third party that provides security tools.

    Luckily, my application deals with authenticated and validated users, so the user base is fairly well known and relatively small - this is not a wide-open, directly public-facing Internet application. As I mentioned earlier in the post, if I had my way I would simply not allow this type of raw HTML input in the first place, and instead rely on a more controlled HTML input mechanism like Markdown, or even a good HTML edit control that can provide some limits on what types of input are allowed. Alas, in this case I was overridden and we had to go forward and allow *any* raw HTML posted.

    Sometimes I really feel sad that it's come this far - how many good applications and tools have been thwarted by fear of XSS (or worse) attacks? So many things could be done *if* we had a more secure browser experience and didn't have to deal with every little script twerp trying to hack into Web pages and obscure browser bugs.
    So much time wasted building secure apps, so much time wasted by others trying to hack apps… We're a funny species - no other species manages to waste as much time, effort and resources as we humans do :-)

    Resources

    Code on GitHub
    Html Agility Pack
    XSS Cheat Sheet
    XSS Prevention Cheat Sheet
    Microsoft Web Protection Library (AntiXss)
    StackOverflow links:
    http://stackoverflow.com/questions/341872/html-sanitizer-for-net
    http://blog.stackoverflow.com/2008/06/safe-html-and-xss/
    http://code.google.com/p/subsonicforums/source/browse/trunk/SubSonic.Forums.Data/HtmlScrubber.cs?r=61

    © Rick Strahl, West Wind Technologies, 2005-2012
    Posted in Security, HTML, ASP.NET, JavaScript

    Read the article

  • Internet speed is suddenly slow only on my laptop, but it's normal in other devices

    - by Wael
    I have a TP-Link router connected to a ZTE modem, with 2 laptops, a tablet and 2 phones connected via the router's WiFi, and a desktop connected to the router via ethernet. Today my laptop's connection to the internet became very slow. At first I thought it was the ISP's problem, but I later found that the connection works fine on the other devices. I tried connecting directly to the modem via WiFi, but it was just as slow. I cannot access Facebook at all, Google takes forever to do a search, and YouTube barely works. The weird thing, though, is that when YouTube does work, the streaming runs at full speed. The same happens when I download a file! My browser is Firefox, but I tried Chrome and IE9 with the same results. I'm on Windows 7. Thanks for any advice.

    Read the article

  • MacBook can't use internet, but nslookup and ping both work

    - by Joel Coehoorn
    I have a user with a new high-end MacBook Pro that can't use the internet. He can connect to either our wired or wireless network and do things like browse file shares, but can get no further. When I brought the machine in for testing, I found that I could do an nslookup just fine, and I'm able to ping addresses returned by nslookup just fine. I'm even able to bring up web pages by entering the IP address into the address bar directly. However, when I try to ping the domain name rather than the IP address, it just sits there. So apparently I can either do name resolution or communicate with an address, but not both at the same time. Again, these symptoms occur on both the wired and wireless network. Other machines on our network, including a few other Macs, don't have this issue. Any ideas?
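    The two-step test described above - resolve the name first, then ping the returned address - can also be scripted so the two steps are timed separately. A small C# sketch of the same idea; the hostname is a placeholder:

        using System;
        using System.Net;
        using System.Net.NetworkInformation;

        // Separates name resolution from reachability, mirroring the manual
        // nslookup-then-ping test described in the question.
        class ResolveThenPing
        {
            static void Main()
            {
                const string host = "www.example.com";   // placeholder hostname

                IPAddress[] addrs = Dns.GetHostAddresses(host);   // resolution only
                Console.WriteLine(host + " resolves to " + addrs[0]);

                using (var ping = new Ping())
                {
                    PingReply byIp = ping.Send(addrs[0], 2000);   // reachability only
                    Console.WriteLine("ping " + addrs[0] + ": " + byIp.Status);

                    PingReply byName = ping.Send(host, 2000);     // resolution + reachability
                    Console.WriteLine("ping " + host + ": " + byName.Status);
                }
            }
        }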

    Read the article

  • Is Internet routing (BGP) fully automated?

    - by Adal
    If all the routing tables on the Internet were erased simultaneously, would the routers be able to rediscover them automatically? I'm having an argument with a colleague who says that the RIPE routing tables are essential, but I remember reading that if the tables disappeared, the BGP protocol would allow routers to rediscover working routes between nodes by querying their neighbors, which in turn would query their neighbors, until a working route was found. That route would then be used to repopulate the routing tables. After a while, all the routes would be restored (not necessarily the optimal routes). Is that correct?
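    As an aside, the re-convergence idea in the question can be sketched as a toy path-vector loop, the protocol family BGP belongs to. This is purely an illustration on a made-up four-node topology, not real BGP (real routers advertise over per-session TCP connections, apply policies, and run best-path selection):

        using System;
        using System.Collections.Generic;
        using System.Linq;

        // Toy path-vector convergence: each router repeatedly advertises every
        // destination it can reach (and the path it uses) to its neighbours
        // until nobody learns anything new.
        class PathVectorDemo
        {
            static void Main()
            {
                // hypothetical topology: router -> neighbours
                var neighbours = new Dictionary<string, string[]>
                {
                    ["A"] = new[] { "B" },
                    ["B"] = new[] { "A", "C" },
                    ["C"] = new[] { "B", "D" },
                    ["D"] = new[] { "C" },
                };

                // router -> (destination -> path taken to reach it);
                // initially each router only knows about itself
                var routes = neighbours.Keys.ToDictionary(
                    r => r,
                    r => new Dictionary<string, List<string>> { [r] = new List<string> { r } });

                bool changed = true;
                while (changed)                      // loop until convergence
                {
                    changed = false;
                    foreach (var router in neighbours.Keys)
                        foreach (var nb in neighbours[router])
                            foreach (var adv in routes[nb].ToList())
                            {
                                // accept an advertised route only if we have none for
                                // that destination and the path doesn't loop through us
                                if (!routes[router].ContainsKey(adv.Key) &&
                                    !adv.Value.Contains(router))
                                {
                                    var path = new List<string> { router };
                                    path.AddRange(adv.Value);
                                    routes[router][adv.Key] = path;
                                    changed = true;
                                }
                            }
                }

                Console.WriteLine("A reaches D via " + string.Join(" -> ", routes["A"]["D"]));
                // prints: A reaches D via A -> B -> C -> D
            }
        }

    Starting from empty tables, every router ends up with a loop-free path to every destination, which matches the intuition in the question - with the caveat that real-world re-convergence also depends on configured sessions and policies surviving.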

    Read the article

  • Internet Connection Sharing/FTP issues

    - by SirSkidmore
    I am currently using a Linux Mint desktop along with a Windows 8 netbook running Internet Connection Sharing to my desktop. On my desktop I can't access FTP sites, but my laptop can, so I think it might be a port issue. I can ping the server from Mint, so I know it's up and running, but I can't access it via telnet. On my Windows 8 netbook I have every protocol checked, including FTP. Originally the FTP server indicated that "Scotty" (my netbook) was hosting the service, so I tried entering the IP of my router, 192.168.1.1, to no avail. Any ideas?

    Read the article

  • Turnkeylinux lampp guest doesn't have internet connection

    - by dave08
    I've set up a TurnKey Linux LAMP server in VirtualBox with two bridged network connections: one for when I'm plugged into my router, and one for when I'm using a wireless connection. This lets me pull up the TurnKey control panel in the host machine's browser, but when I go to the command prompt in the guest and run apt-get update, the guest doesn't seem to have an internet connection, even though it can reach the host. What could be wrong? Thank you very much in advance for any answers!

    Read the article

  • How share internet connection between two laptops

    - by danielgratzz
    I have what appears to be a cable modem plugged into the wall with only ONE ethernet port on it, so I can only connect one computer to it. It also has no wireless capability. I have to dial up the connection and enter a username and password on my laptop. How can I share this internet connection between two laptops running Windows 7 Ultimate? I have spare ethernet cables if that would help. Please help, thank you.

    Read the article

  • How to simulate slow internet connection

    - by V-Light
    I currently deploy with GAE (Google App Engine) and I'm trying to implement some AJAX validation. I have a couple of text fields and "spinners" (AJAX loaders) that should be displayed while an AJAX request is in flight. But I deploy on my local computer (localhost), so the GAE SDK responds very quickly to any request: the whole AJAX round trip takes about 50-70 ms (milliseconds), which is far from realistic. Is there a way to somehow simulate a slow Internet connection? I just want to see how my "spinners" work, and I want to test some AJAX settings (jQuery) for timeouts, errors and so on. Any ideas?
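    One low-tech option (besides the throttling built into some browser and proxy tools) is to point the AJAX call at a local stub endpoint that sleeps before answering. A minimal sketch in C# - the port, delay and JSON body are made-up values for illustration, and the same idea works in any server stack, including a GAE handler:

        using System;
        using System.Net;
        using System.Text;
        using System.Threading;

        // Minimal local stub that answers every request after a fixed delay,
        // so AJAX spinners and timeout handling can be observed.
        class SlowStub
        {
            static void Main()
            {
                var listener = new HttpListener();
                listener.Prefixes.Add("http://localhost:8081/");   // hypothetical port
                listener.Start();
                Console.WriteLine("Slow stub listening on http://localhost:8081/");

                while (true)
                {
                    HttpListenerContext ctx = listener.GetContext(); // blocks per request
                    Thread.Sleep(3000);                              // fake 3 s of latency

                    byte[] body = Encoding.UTF8.GetBytes("{\"ok\": true}");
                    ctx.Response.ContentType = "application/json";
                    ctx.Response.OutputStream.Write(body, 0, body.Length);
                    ctx.Response.Close();
                }
            }
        }

    Point the jQuery call at the stub's URL and the 50-70 ms round trip becomes a visible multi-second one, which is enough to exercise spinners and timeout settings.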

    Read the article

  • Embedded video is not shown in internet explorer 9 RC or Beta

    - by Jagannath
    In IE9 RC or beta, embedded video is not shown on some web pages. I verified the security settings in "Internet Options" and did not find any issue. I am able to view Flash videos on YouTube, but for some reason, on some sites the embedded video is not shown. I don't have this issue with Firefox. UPDATE: I am able to view the embedded video from an Admin account, but not from a Standard account.

    Read the article

  • Internet connection sharing: Ubuntu 9.10 Server on Windows 7 and VMWare

    - by avesse
    I'm trying to get Internet Connection Sharing (ICS) working between my Windows 7 RTM host and a Ubuntu 9.10 Server running on VMWare Workstation 6.5, but I have not been able to get it right. Here's what I have done:
    - Configured VMWare to use Host Only networking (I tried NAT as well).
    - Enabled ICS on my host's network connection, allowing VMnet1. After enabling it, Windows informed me that its VMnet1 IP had changed to 192.168.137.1.
    - In VMWare's Virtual Network Editor, configured VMnet1 with subnet 192.168.137.0 and mask 255.255.255.0. I did the same for DHCP. For NAT I set 192.168.137.1 as the gateway.
    I cannot ping any sites or get access through apt-get/aptitude install/update, although domains do get resolved to IPs. I have also tried using a static IP in Ubuntu. I don't know if it makes a difference, but my external IP is locked to my host's MAC address.

    Read the article

  • Internet wireless connected with limited access, windows vista

    - by r0ca
    I had some malware on my computer, so I did a bit of manual work to remove it, including resetting TCP/IP. The malware is now gone. I can see my home wireless network and I can connect to it, but once connected I get the "wireless connected with limited access" message. When I open IE I cannot browse. When I tried to ping 192.168.1.1 I got Error Code 1231, an unreachable-network problem. I have deactivated the Windows firewall in case it was being overly aggressive; still no luck. I have Norton, but it is not active; I also have Avast and AVG installed, but they are not active either. Any ideas?

    Read the article

  • Windows Server 2008 R2 slows internet speed

    - by Tone
    I just installed Windows Server 2008 R2 as my main file server on my home network. I've noticed that often times when I start my day my internet connection speed is slow. I'll go to Speakeasy speed test and it'll be at about 25% of its normal speed. When I restart my Server 2008 machine it increases back to normal. It will stay normal until Server 2008 has been running for a while. Any ideas? Edit: I had installed Collabnet Subversion within the past week which installs/sets up some other stuff for web access, I just uninstalled it. I'll report back tomorrow if that fixed my problem.

    Read the article

  • Windows XP problems displaying internet browser backgrounds correctly

    - by Samurai Waffle
    My friend has a Windows XP computer that doesn't show the colored background on web pages; the background is always white. On top of that, some pictures won't show up; there is just an empty white frame. Also, when you left-click a folder, instead of opening it, Windows opens a new window that turns out to be the search results window. I've never heard of these problems before, and I can't find any information about them on the internet. I assume it's a virus deeply embedded in the system, but no virus scanner has found anything. Thanks for the help!

    Read the article

  • In Ubuntu, MoBlock makes it take a while to actually start using internet

    - by Matchu
    When connecting to wireless internet in Ubuntu (tested with two different networks), I connect nearly instantly. However, to actually load a page, I need to wait a few minutes, at which point I can actually use a web browser or Pidgin. Until then, various applications try to connect until they time out. I've discovered that, if instead of waiting a few minutes, I open Terminal and run sudo /etc/init.d/blockcontrol stop, everything suddenly is able to load. I can then start MoBlock again with no ill effects. Why is this happening? What is it that would cause MoBlock to take a few minutes to start letting traffic in, but only when started on bootup? Thanks!

    Read the article

  • Internet connection very slow after Linksys configuration

    - by NLV
    Hello. We have this network setup:
    - Server1 - DHCP server, Domain Controller, AD
    - Leased line for the Internet connection
    - From the leased line to a Linksys router (we don't use wireless, though)
    - From the Linksys to a Netgear 24-port switch and a Vonage VoIP box
    - From the Netgear to all our machines
    We configured the Linksys with the static IP and DNS server addresses our ISP gave us, and we have routed it correctly. All our work machines are configured to get an IP automatically, with the DNS server addresses our ISP gave. The problem is that none of the sites open promptly; it takes around 5 minutes to load google.com. But we are able to ping all the sites. What could be the problem?

    Read the article

  • Windows 7 Ultimate x64 - No Internet access

    - by rafek
    I've just installed Windows 7 Ultimate x64. As with all of my computers at home, I connect to my router via WiFi (open/WEP). What has happened is that the Wireless Connection shows Connected, but with No Internet access. My ipconfig says:

        Autoconfiguration IPv4 Address : 169.254.33.161
        Subnet Mask : 255.255.0.0
        Default Gateway : (empty)

    After ipconfig /renew I get: "An error occurred while renewing interface Wireless Network Connection: unable to contact your DHCP server. Request has timed out." [..] I've been looking for a solution for 2 hours now...

    Read the article

  • Sending an internet radio signal from my hifi to a portable

    - by Paul
    I'm just about to set up a wireless network in my house so that I can integrate an internet radio into my hifi system. What I would love to do is listen to the radio in another room of the house. I also have a little portable radio/CD player with a USB port on the front. Is there something I could buy that would allow me to listen to the radio through my portable in another room? I do realize that I could solve this problem by buying some wireless portable speakers; I just wondered if anybody knew another way, e.g. Bluetooth or something similar?

    Read the article

  • Sharing Internet Connection in Windows 7 is so much more frustrating than Windows XP

    - by Phuong Nguyen
    Back in the days of Windows XP, from the Properties dialog of my Wireless Connection I could enable sharing and then select the LAN network from the drop-down list, and boom - I could share it with my friend. We just needed a LAN cable (crossover or not, either was OK) and his laptop would get an automatic IP and gain access to the internet. But now, with the new Windows 7, everything starts to suck. I cannot see the drop-down list any more in the sharing panel, and my friend's laptop cannot get an automatic IP any more. Am I doing anything wrong here? How can I get back the ease I used to have with Windows XP?

    Read the article

  • Share internet with my phone?

    - by Kenneth Cochran
    Most people want to use their cellphone as a modem for their computer, commonly referred to as 'tethering'. I'm actually interested in doing the opposite: sharing my landline internet connection (which is much faster than any 3G service) with my cellphone. My phone is a Verizon BlackBerry Curve 8330, and it has USB and Bluetooth connections. I know both USB and Bluetooth are capable of carrying TCP/IP traffic; what's not so clear is: Is IP over USB or Bluetooth standardized? Is it supported on my phone? Has my cellphone company crippled my phone to prevent me from using it?

    Read the article

  • disable 250 character URL limit in Internet Explorer

    - by Keltari
    Users of a SharePoint document library are getting this error: "The URL for this file is too long for the application. A temporary copy of this file will be opened on your computer. You must save this copy as a new file." After doing some research, it appears Internet Explorer has a limit of roughly 250 characters for a URL, and some URLs generated by SharePoint far exceed this limit - one example is 790 characters long. Is there a way to disable this limit? I have looked, but there doesn't appear to be a solution other than shortening the folder/path names.

    Read the article

  • Approach for monitoring internet backbone traffic volume

    - by Greg Harman
    I'm interested in getting a picture of relative volume across different internet backbones. In particular, I'd like to see how traffic volume over a given route differs over the course of a day or from one day to the next. InternetTrafficReport.com is the closest approximation to this that I've found online, and their approach is to test ping times to a number of key routers from several geographically-dispersed servers. This sounds like one straightforward way to measure, but I don't have several geographically-dispersed servers. Is there a different approach for sampling this type of information from a single server?
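    For what it's worth, even a single server can log relative round-trip trends over time. A minimal C# sketch of that kind of sampling follows; the target list is purely illustrative (these happen to be anycast resolvers, so each ping measures the path to the nearest node rather than a specific backbone route):

        using System;
        using System.Net.NetworkInformation;

        // Samples round-trip times to a fixed set of targets; run it on a
        // schedule (cron / Task Scheduler) and the log shows how latency
        // drifts over the course of a day.
        class BackboneSampler
        {
            static void Main()
            {
                // illustrative targets only - substitute routers on the routes you care about
                string[] hosts = { "8.8.8.8", "1.1.1.1", "208.67.222.222" };

                using (var ping = new Ping())
                {
                    foreach (string host in hosts)
                    {
                        try
                        {
                            PingReply reply = ping.Send(host, 2000);   // 2 s timeout
                            Console.WriteLine(reply.Status == IPStatus.Success
                                ? string.Format("{0:o} {1} {2} ms", DateTime.UtcNow, host, reply.RoundtripTime)
                                : string.Format("{0:o} {1} {2}", DateTime.UtcNow, host, reply.Status));
                        }
                        catch (PingException ex)
                        {
                            Console.WriteLine(string.Format("{0:o} {1} error: {2}", DateTime.UtcNow, host, ex.Message));
                        }
                    }
                }
            }
        }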

    Read the article

  • Can't connect Alienware M11x wireless to internet through family's router

    - by Jim Kron
    Morning all. I have an Alienware M11x loaded with Win 7 Premium and the Dell half-card WiFi. I also have Netgear and Belkin USB external adapters (b/g and N, including dual radios) - no joy with those either. My family's Internet is served through Charter, and they use a Motorola router. Even if we reset the router, I cannot connect to the Net, though I can talk to the router. BTW, my brother only uses WEP, as a number of the connected devices are old and my folks are not in a high-threat area for attacks. Frustrated, as I know what I'm doing, but this really has me stumped. Any thoughts? Much appreciated, Jim

    Read the article

  • Internet Explorer 10 auto-correct randomly capitalises words

    - by Andreas Rejbrand
    I use Internet Explorer 10 on my Windows 7 laptop (because the fingerprint browser add-on only works in IE). I have noticed that IE 10 has a built-in spellchecker, which is great. Now, I write about 90 % English and 10 % Swedish on web sites, so it is somewhat unfortunate that it isn't possible to change the spell-checker language 'on-the-fly'. But that's a minor issue. The main problem is that, when I write English comments, the spell-checker (apparently randomly) capitalises some Words. For instance, the 'words' of the previous sentence was changed automatically to 'Words'. Another Word that is Always capitalised is Before. (Yes, the Three Words that are incorrectly capitalised in the previous sentence are due to the auto-correct 'feature', and apparently 'three' is also a problematic word.) Is this really something that happens to every (bilingual) user? Is there any way to fix the issue (preferably without disabling the spell-checker altogether)?

    Read the article
