Search Results

Search found 7920 results on 317 pages for 'drupal cache'.

  • Disable thumbnails in Windows XP?

    - by quack quixote
    I regularly use Windows Explorer to browse my drives and data, and I notice little freezes and hiccups at times. It's especially noticeable when browsing local or network folders with lots of video files (AVI, MKV, MPG, etc.). I almost always browse in Details view, and the "Do not cache thumbnails" option is turned on. Even though I'm in Details mode, I'm convinced the sluggishness is due to Windows trying to generate thumbnails for the video files, so I want to disable thumbnail generation for those files. I occasionally use Thumbnails view for browsing image files, so I don't want to disable all thumbnails; but for future reference, that might be good to know too. So I have three questions about disabling Windows thumbnail generation, plus one about undoing it:
    - How do I disable thumbnail generation for all non-image files?
    - How do I disable thumbnail generation for all files?
    - How do I disable thumbnail generation for one specific filetype?
    - And finally, how do I re-enable thumbnails once I've performed one of the above?
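
    For the per-filetype case, one commonly cited approach on XP is to remove that type's thumbnail-extraction shell extension from the registry. A sketch only: the CLSID below is the handler usually registered for thumbnails, but verify the key exists on your machine before deleting, and export it first so re-importing re-enables thumbnails later.

        rem Back up, then remove the thumbnail (IExtractImage) handler for .avi.
        rem The CLSID is an assumption; check it under HKCR\.avi\ShellEx first.
        reg export "HKCR\.avi\ShellEx\{BB2E617C-0920-11d1-9A0B-00C04FC2D6C1}" avi-thumb-backup.reg
        reg delete "HKCR\.avi\ShellEx\{BB2E617C-0920-11d1-9A0B-00C04FC2D6C1}" /f
        rem To re-enable thumbnails for .avi later:
        reg import avi-thumb-backup.reg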

    Read the article

  • Shared files folder in Amazon Elastic Beanstalk environment

    - by por
    I'm working on a Drupal application, which is planned to be hosted in an Amazon Elastic Beanstalk environment. Basically, Elastic Beanstalk enables the application to scale automatically by starting additional web server instances based on predefined rules. The shared database is running on an Amazon RDS instance, which all instances can access properly. The problem is the shared files folder (sites/default/files). We're using git as SCM, and with it we're able to deploy new versions by executing $ git aws.push. In the background, Elastic Beanstalk automatically deletes ($ rm -rf) the current codebase from all servers running in the environment and deploys the new version. The plan was to use S3 (s3fs) for shared files in the staging environment and NFS in the production environment. We've managed to set up the environment to the extent that the shared files folder is mounted properly after a reboot. But... the problem is that, in this setup, the deployment of new versions on running instances fails because $ rm -rf can't remove the mounted directory; as a result, the entire environment goes down and we need to restart the environment, which isn't really an elegant solution. Question #1: what would be the proper way to manage shared files in this kind of deployment? Are you running such an environment? How did you solve the problem? Looking at the Elastic Beanstalk Hostmanager code (Ruby), there seems to be a way to hook our functionality (unmount if mounted in pre-deploy, mount in post-deploy) into Hostmanager (/opt/hostmanager/srv/lib/elasticbeanstalk/hostmanager/applications/phpapplication.rb), but the scripts defined in the file (i.e. /tmp/php_post_deploy_app.sh) don't seem to be working. That might be because our Ruby skills are non-existent. Question #2: did you manage to hook your functionality into Hostmanager in a portable way (i.e. without changing the core Hostmanager files)?
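
    For reference, the hook being aimed for here is conceptually simple; a minimal sketch of the two scripts (the mount point, NFS export, and script paths are assumptions based on the Hostmanager paths above):

        # Hypothetical pre-deploy hook (/tmp/php_pre_deploy_app.sh):
        # unmount the shared files dir so Hostmanager's rm -rf can succeed.
        grep -qs '/var/app/current/sites/default/files' /proc/mounts \
            && umount /var/app/current/sites/default/files

        # Hypothetical post-deploy hook (/tmp/php_post_deploy_app.sh):
        # re-mount the shared files dir once the new codebase is in place.
        mount -t nfs nfs-server:/export/drupal-files \
            /var/app/current/sites/default/files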

    Read the article

  • Java 6.0 Virtual Machine re-caches application on every load

    - by David Neale
    I have a Java application which is loaded and cached by the JRE, and for most users it only needs to be cached once unless the application software has changed. However, I have one computer that re-caches the entire application every time it is loaded. It is not the version of the JRE; I have the same version running on other machines. It also works on this machine if logged in as a local admin, just not as a standard user. Does anybody have any ideas on what might be causing this?

    Read the article

  • Gateway MX6440 CPU Upgrade

    - by BPugh
    I received a Gateway MX6440 laptop as a freebie, but I'm interested in upgrading its AMD Turion 64 ML-32 (socket 754) to something faster (and with more cache). I know the range of processors that could work based on the family list in Wikipedia. However, this computer has the stock BIOS, and the updates from Gateway that I haven't applied don't specify processor support. I'm looking to go to at least a 2.2 GHz part (ML-40). Has anybody upgraded the processor in this model, or others in the series, with success or failure? And do you happen to have any guides handy for working with the heat sink? Any Googling I have done keeps hitting RAM marketers. Update: The computer died before I had a chance to try this out.

    Read the article

  • thttpd: Daemon exiting, I don't know why

    - by Tobe
    I run thttpd to serve some Perl files, but for some reason the daemon is exiting every second or third day. Strangely, it's always at 6:25 am. Here are some lines from syslog:

        Nov 10 06:25:40 b1 thttpd[6370]: up 86404 seconds, stats for 86404 seconds:
        Nov 10 06:25:40 b1 thttpd[6370]: thttpd - 25 connections (0.000289338/sec), 1 max simultaneous, 625000 bytes (7.23346/sec), 2 httpd_conns allocated
        Nov 10 06:25:40 b1 thttpd[6370]: libhttpd - 30 strings allocated, 8200 bytes (273.333 bytes/str)
        Nov 10 06:25:40 b1 thttpd[6370]: map cache - 0 allocated, 0 active (0 bytes), 0 free; hash size: 0; expire age: 1800
        Nov 10 06:25:40 b1 thttpd[6370]: fdwatch - 20902 selects (0.24191/sec)
        Nov 10 06:25:40 b1 thttpd[6370]: timers - 2 allocated, 2 active, 0 free
        Nov 10 06:25:40 b1 thttpd[6370]: exiting

    Any ideas?
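
    Worth noting: 6:25 am is the default time Debian-based systems run /etc/cron.daily (see the stock /etc/crontab), so a daily job such as a logrotate script stopping the daemon is a plausible culprit. A quick check, assuming a Debian-style layout:

        # Debian's stock /etc/crontab runs cron.daily at 06:25:
        #   25 6 * * * root test -x /usr/sbin/anacron || ( cd / && run-parts --report /etc/cron.daily )
        grep -n '25 6' /etc/crontab
        grep -rn thttpd /etc/cron.daily /etc/logrotate.d 2>/dev/null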

    Read the article

  • Adobe Plugin in Firefox showing grey screen; something to do with range requests on Apache?

    - by Sam Minnée
    I have a web page with a link to a PDF file (target="_blank"). If I click the link, the PDF reader just shows a grey screen within the Firefox browser. If I copy that link and manually open it in a new tab, the PDF will display correctly, and subsequent requests made by clicking the original link now work, suggesting that the problem occurs when loading the file into the cache. It appears as though the Adobe PDF reader plugin is making byte-range requests (I see lots of 206 responses) and I suspect that this may be the cause of the issue. I am running an Apache webserver. Has anyone had problems with Apache and Adobe's byte-range requests? Are there any workarounds?
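
    One frequently suggested workaround is to stop advertising and accepting byte ranges for PDFs, forcing the plugin to download the whole file. A sketch, assuming mod_headers is loaded:

        # Untested sketch: disable byte-range handling for PDFs only.
        <FilesMatch "\.pdf$">
            # Stop advertising range support to the client...
            Header unset Accept-Ranges
            # ...and ignore any Range headers the plugin sends anyway.
            RequestHeader unset Range
        </FilesMatch>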

    Read the article

  • Why am I unable to turn off recursion in ISC BIND?

    - by nbolton
    Here's my named.conf.options file:

        options {
            directory "/var/cache/bind";
            dnssec-enable yes;
            auth-nxdomain no;    # conform to RFC1035
            listen-on-v6 { any; };

            # disable recursion
            recursion no;
        };

    I've tried adding allow-recursion { "none"; }; before recursion, but this also has no effect. I'm testing it by using nslookup on Windows, with google.com. as the query (and it returns an IP, so I assume recursion is on). This issue occurs on two servers with similar setups.
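
    A more direct test than nslookup, run from a host that should be denied recursion (the server name below is a placeholder):

        # Validate that the config parses and is the one actually loaded:
        named-checkconf /etc/bind/named.conf
        # Then ask for a name the server is not authoritative for. With
        # recursion really off, expect REFUSED, or a cached answer plus
        # "WARNING: recursion requested but not available".
        dig @dns1.example.com google.com A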

    Read the article

  • How to automate kinit process to obtain TGT for Kerberos?

    - by tore-
    I'm currently writing a Puppet module to automate the process of joining RHEL servers to an AD domain, with support for Kerberos. Currently I have a problem with automatically obtaining and caching the Kerberos ticket-granting ticket via 'kinit'. If this were to be done manually, I would do this:

        kinit [email protected]

    This prompts for the AD user's password, which makes it hard to automate. How can I automate this? I've found some posts mentioning using kadmin to create a database with the AD user's password in it, but I've had no luck. Thanks for any input.
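
    The standard non-interactive route is a keytab: store the account's key once with ktutil, then let cron or Puppet run kinit -kt with no password prompt. A sketch (the enctype and keytab path are assumptions; match them to your AD settings):

        # One-time: write the AD account's key into a keytab.
        ktutil
        ktutil:  addent -password -p [email protected] -k 1 -e rc4-hmac
        ktutil:  wkt /etc/hostjoin.keytab
        ktutil:  quit

        # Thereafter, obtain/renew the TGT without a prompt:
        kinit -kt /etc/hostjoin.keytab [email protected]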

    Read the article

  • Browsers ignoring hosts file

    - by madkris
    Recently my browsers started to ignore my hosts file. I have Windows 7 installed, with this entry:

        192.168.0.5 livesite.com

    I have tried:
    - Clearing the browser cache
    - Issuing "ipconfig /flushdns" from the command line
    - Issuing "ping livesite.com" from the command line (response was "Reply from 192.168.0.5: bytes=32 time=1ms TTL=128")
    - Restarting the unit
    - Backing up the original hosts file and making a new one
    - Checking lmhosts.sam (everything is commented out)
    - Connecting directly to the modem using a cable
    - Checking \HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Services\Tcpip\Parameters\DataBasePath
    - Trying it on another laptop with exactly the same specs as mine

    Then I tried:
    - Changing the entry to "127.0.0.1 livesite.com" (ping ok, browser ok)
    - Changing the entry to "192.168.0.5 livesite.com" (ping ok, browser ok but only for a second)
    - Issuing "ipconfig /flushdns" from the command line (ping ok, browser not ok)
    - Changing the entry to "127.0.0.1 livesite.com" (ping ok, browser ok)
    - Changing the entry to "192.168.0.5 livesite.com" (ping ok, browser not ok)
    - Issuing "ipconfig /flushdns" from the command line (ping ok, browser not ok)

    Any idea why it worked for a moment? Or, better yet, anything I haven't tried or some error I may have overlooked?

    Read the article

  • OpenBSD in a virtual box as a firewall

    - by Ali
    Is there any merit in installing a virtual machine with OpenBSD and pf (or any other simple and secure OS + iptables) on a Mac laptop and routing all the traffic through that machine? I read about a similar setup for corporate laptops running Windows (I think I read this in BSD Magazine). The claim was that Windows machines are too hard to secure, and if you are taking them into the wild (public wireless, hotels, ...) you'd better put a secure OS in between! If you think this is a good idea: how do you route all the traffic on a Mac through the virtual machine and prevent any application or service from going out directly? I am not sure that just setting the gateway will do that. And what about DNS? You don't want anybody to fool you with DNS cache poisoning or similar attacks either.
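
    As a starting point, the pf side of such a gateway VM can stay very small; a minimal pf.conf sketch for a two-interface VM (interface names and the NAT line are assumptions; nat-to is OpenBSD 4.7+ syntax):

        ext_if = "em0"    # outside interface, bridged to the real network (assumption)
        int_if = "em1"    # host-only interface facing the Mac (assumption)
        set skip on lo0
        # rewrite the Mac's traffic to the VM's outside address
        match out on $ext_if from $int_if:network nat-to ($ext_if)
        block log all
        pass in on $int_if from $int_if:network to any
        pass out on $ext_if inet keep state

    For DNS, the same idea applies: point the Mac's resolver at the VM (or at a resolver reached through it) so lookups also pass through the filtered path.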

    Read the article

  • Unable to remove Read-Only attribute from folder in Windows XP

    - by elcuco
    I have this directory from which I cannot remove the read-only attribute. The computer is running XP SP2 (or SP3, not sure) and the directory sits on an NTFS file system. Looking on the web I found this: http://support.microsoft.com/kb/256614, which says that if the directory is "customized" it's treated as a system folder and thus "read only". I don't think that is the scenario in my case, but anyway it's not helping; their recommendation is more or less:

        attrib -r -s d:\data /s /d

    and this is not working for me. Any other ideas? More info: the directory is served by an HTTP server (WAMP) and the directory is an SVN checkout. What happens is that the web server cannot write files into the directory (imagecache from Drupal, if you are really interested). Edit 2: The original post claimed that the directory sits on a VFAT FS; however, I booted Fedora 11 from a live CD and the partition is marked as NTFS. Edit 3: I left the company where I worked when this situation happened... so I cannot fully close this question. But things got even worse: I tested the "attrib -r" answer I posted and it did not work for me, yet now the developer says that it worked for her. A nice WTF moment. Probably a reboot helped... Sorry for losing the details. If anyone has the same problem and one of the answers helps - please comment.
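
    Two things worth knowing here: on NTFS, the Read-only attribute on a folder does not actually block writes (Windows uses it to flag customized folders), so a web server failing to write points at NTFS permissions instead. A hedged check/fix with cacls (the account name is an assumption for whatever user the web server runs as):

        rem Show who can write to the directory
        cacls d:\data
        rem Grant Change (write) access recursively to the web server account
        cacls d:\data /E /T /G "ApacheUser:C"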

    Read the article

  • How do we keep Active Directory resilient across multiple sites?

    - by Alistair Bell
    I handle much of the IT for a company of around 100 people, spread across about five sites worldwide. We're using Active Directory for authentication, mostly served to Linux (CentOS 5) systems via LDAP. We've been suffering through a spate of events where the IP tunnel between the two major sites goes down and the secondary domain controller at one site can't contact the primary domain controller at the other. It seems that the secondary domain controller starts denying user authentication within minutes of losing connectivity to the primary. How do we make the secondary domain controller more resilient to downtime? Is there a way for it to cache the entire directory and/or at least keep enough information locally to survive a multi-hour disconnection? (We're all in a single organizational unit if that makes any difference.) (The servers here are Windows Server 2003; don't assume that we set this up correctly. I'm a software engineer, not an IT specialist.)

    Read the article

  • How to remove the link from the following JavaScript?

    - by murali
    Hi, I am unable to remove the link from the keywords which are coming from the database....

        var googleurl = "http://www.google.com/#hl=en&source=hp&q=";

        function displayResults(keyword, results_array) {
            // start building the HTML table containing the results
            var div = "<table>";
            // if the searched-for keyword is not in the cache then add it to the cache
            try {
                // if the array of results is empty display a message
                if (results_array.length == 0) {
                    div += "<tr><td>No results found for <strong>" + keyword + "</strong></td></tr>";
                    // set the flag indicating that no results have been found
                    // and reset the counter for results
                    hasResults = false;
                    suggestions = 0;
                }
                // display the results
                else {
                    // resets the index of the currently selected suggestion
                    position = -1;
                    // resets the flag indicating whether the up or down key has been pressed
                    isKeyUpDownPressed = false;
                    /* sets the flag indicating that there are results for the
                       searched-for keyword */
                    hasResults = true;
                    // loop through all the results and generate the HTML list of results
                    for (var i = 0; i < results_array.length - 1; i++) {
                        // retrieve the current function
                        crtFunction = results_array[i];
                        // set the string link for the current function
                        // to the name of the function
                        crtFunctionLink = crtFunction;
                        // replace the _ with - in the string link
                        while (crtFunctionLink.indexOf("_") != -1)
                            crtFunctionLink = crtFunctionLink.replace("_", "-");
                        // start building the HTML row that contains the link to the
                        // help page of the current function
                        div += "<tr id='tr" + i + "' onclick='location.href=document.getElementById(\"a" + i + "\").href;' onmouseover='handleOnMouseOver(this);' "
                             + "onmouseout='handleOnMouseOut(this);'>"
                             + "<td align='left'><a id='a" + i + "' href='" + googleurl + crtFunctionLink;
                        // check to see if the current function name length exceeds the maximum
                        // number of characters that can be displayed for a function name
                        if (crtFunction.length <= suggestionMaxLength) {
                            div += "'>" + crtFunction.substring(0, httpRequestKeyword.length) + "";
                            div += crtFunction.substring(httpRequestKeyword.length, crtFunction.length) + "</a></td></tr>";
                        }
                        else {
                            // check to see if the length of the current keyword exceeds
                            // the maximum number of characters that can be displayed
                            if (httpRequestKeyword.length < suggestionMaxLength) {
                                div += "'>" + crtFunction.substring(0, httpRequestKeyword.length) + "";
                                div += crtFunction.substring(httpRequestKeyword.length, suggestionMaxLength) + "</a></td></tr>";
                            }
                            else {
                                div += "'>" + crtFunction.substring(0, suggestionMaxLength) + "</td></tr>";
                            }
                        }
                    }
                }
                // end building the HTML table
                div += "</table>";
                var oSuggest = document.getElementById("suggest");
                var oScroll = document.getElementById("scroll");
                // scroll to the top of the list
                //oScroll.scrollTop = 1; -- murali commented
                // update the suggestions list and make it visible
                oSuggest.innerHTML = div;
                oScroll.style.visibility = "visible";
                // if we had results we apply the type-ahead for the current keyword
                if (results_array.length > 0)
                    autocompleteKeyword();
            } catch (e) {
            }
        }

    How do I remove the href from the above snippet? Whenever I remove anything, the drop-down vanishes. Please help me remove the href.
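
    If the goal is simply to show the suggestion text without it being a link, a minimal sketch is to emit a span instead of the anchor and drop the row's onclick. This is a hypothetical replacement for the row-building code above; it keeps the hover handlers intact:

        // Build the row without an <a>, so nothing navigates on click.
        div += "<tr id='tr" + i + "'"
             + " onmouseover='handleOnMouseOver(this);'"
             + " onmouseout='handleOnMouseOut(this);'>"
             + "<td align='left'><span id='a" + i + "'>"
             + crtFunction.substring(0, Math.min(crtFunction.length, suggestionMaxLength))
             + "</span></td></tr>";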

    Read the article

  • Need data on disk drive management by OS: getting base I/O unit size, "sync" option, Direct Memory Access

    - by Richard T
    Hello all, I want to ensure I have done all I can to configure a system's disks for serious database use. The three areas I know of (any others?) to be concerned about are:
    - I/O size: the database engine's and the disk's native I/O sizes should either match, or the database's native I/O size should be a multiple of the disk's native I/O size.
    - Disks that are capable of Direct Memory Access (e.g. IDE) should be configured for it.
    - When a disk says it has written data persistently, it must be so! No keeping it in cache and lying about it.

    I have been looking for information on how to ensure these are so for CentOS and Ubuntu, but can't seem to find anything at all! I want to be able to check these things and change them if needed. Any and all input appreciated.
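
    On CentOS/Ubuntu, those three checks map onto a handful of standard tools; a sketch (device names are assumptions):

        # 1. Native I/O sizes: filesystem block size vs. the DB's page size.
        blockdev --getbsz /dev/sda1      # block size the kernel uses for this device
        # 2. DMA: confirm a DMA mode is selected for an IDE/SATA drive.
        hdparm -I /dev/sda | grep -i dma
        # 3. Write cache: show it, and disable it so "written" means on-platter.
        hdparm -W  /dev/sda              # query write-cache state
        hdparm -W0 /dev/sda              # disable volatile write caching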

    Read the article

  • Installing PHP APC in Fedora - Unable to initialize module?

    - by sri
    I have been trying to install APC on my Fedora Apache server for showing a progress bar while uploading files, but I am getting the following PHP warning while starting XAMPP:

        Starting XAMPP for Linux 1.7.1...
        PHP Warning:  PHP Startup: apc: Unable to initialize module
        Module compiled with module API=20090626, debug=0, thread-safety=0
        PHP compiled with module API=20060613, debug=0, thread-safety=0
        These options need to match
        in Unknown on line 0
        XAMPP: Starting Apache with SSL (and PHP5)...
        XAMPP: Starting MySQL...
        XAMPP: Another FTP daemon is already running.
        XAMPP for Linux started.

    My server details:
    - OS: Fedora 12
    - XAMPP version: 1.7.1
    - PHP version: 5.2.9
    - APC version: 3.1.9

    I have tried the process as mentioned here:
    1) http://2bits.com/articles/installing-php-apc-gnulinux-centos-5.html
    2) http://stevejenkins.com/blog/2011/08/how-to-install-apc-alternative-php-cache-on-centos-5-6/
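
    That warning means the apc.so was built against PHP 5.3 (module API 20090626) while XAMPP 1.7.1 ships PHP 5.2 (API 20060613). The usual fix is to rebuild APC with XAMPP's own phpize. A sketch (the /opt/lampp paths are assumptions, and XAMPP's development package must be installed for phpize to exist):

        cd APC-3.1.9
        /opt/lampp/bin/phpize
        ./configure --with-php-config=/opt/lampp/bin/php-config
        make
        make install   # copies apc.so into XAMPP's extension_dir
        # then enable it in XAMPP's php.ini:  extension=apc.so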

    Read the article

  • Suddenly blocked from a site

    - by Diego Romero
    For some time now I haven't been able to reach a site I used to visit frequently for maintenance (WordPress). I tried different browsers, restarting my laptop, and clearing the cache, history, and cookies. I also did a ping to the site's IP: 4 packets sent and 4 lost. I think this is a problem with only my laptop, since I've been able to reach the site from other devices on the same network. I have also tried connecting to the site from a completely different network, with the same problem. I really don't know what to do about this; any advice? PS: the site is hosted on WP Engine, if that has anything to do with this problem.

    Read the article

  • Determining Performance Limits

    - by JeffV
    I have a number of Windows processes that pass messages between them at a high rate using TCP to localhost. Aside from testing on actual hardware, how can I assess what my hardware limit will be? These applications are not doing CPU-intensive work: mostly decomposing and combining messages, scanning over them for special flags in the data, etc. The message size is typically 3 KB and the rate is typically ~10k messages per second, i.e. ~30 MB per second between processing stages. There may be 10 or more stages, depending. For this type of application, what should I look at when assessing performance? What do I look for in a server, performance-wise? I am currently running a Xeon L5408 with 32 GB of RAM, but I am assuming cache is more important than actual RAM size, as I am barely touching the RAM.

    Read the article

  • Can't set ACLs for AFS even though I'm in the right group

    - by Torandi
    I've got a directory that I've lost control over in an AFS system. According to the system administrators, my admin subgroup (dsekt:admin) has rlidwka on the dir. I'm a member of this group (and I can list the members of the group and see my nick there), yet I can't set the ACLs for the dir. The pts output:

        $ pts membership dsekt:admin
        Members of dsekt:admin (id: -6813) are:
        /.../
        taran

    And my klist:

        $ klist
        Credentials cache: FILE:/tmp/krb5cc_56782
        Principal: [email protected]

    Both dsekt:admin and the dir are on the NADA.KTH.SE node.
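
    Note that klist only shows Kerberos tickets; AFS authorizes with tokens, so a valid TGT with no AFS token would produce exactly this symptom. A hedged sequence to try (the directory path is a placeholder):

        aklog     # convert the Kerberos ticket into an AFS token
        tokens    # verify a token for the cell is now listed
        fs setacl -dir /afs/nada.kth.se/path/to/dir -acl dsekt:admin all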

    Read the article

  • Command-line HTTP crawler for Windows?

    - by Pekka
    Would somebody have a recommendation for a web site crawler that can be invoked and equipped with settings from the command line? This would need to run in a Windows environment. Saving the data, following stylesheet links etc. is not an issue. I only need the crawler to start with a page, parse it, and follow all the links on the same domain so that in the end, all pages on the site have been requested once. Background: I'm setting up a web site that gets frequently uploaded from an office location. Combining data from various sources, it has several levels of caching. I don't want the first user to visit the site after a fresh upload to have to wait until the page has been generated and saved in the cache.
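
    For what it's worth, wget (which runs fine on Windows) can already do this kind of cache-warming crawl; a sketch, assuming the site lives at example.com:

        rem Crawl every page on the site once and discard the downloads;
        rem wget stays on the starting host by default.
        wget --recursive --level=inf --no-parent --delete-after http://example.com/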

    Read the article

  • Would a Socket Connection Outperform an Interval-Based Database Sweep and Requests?

    - by Jascha
    I'm building a small chat application to add to an existing framework. There will only be 20-50 users MAX at any one time. I was wondering if I could get away with updating a cache file containing (semi) live chat data for whichever users happen to be chatting, just by performing timed queries and regular AJAX refreshes for new data, as opposed to learning how to open and maintain a socket connection. I'm sure there are existing chat plug-ins out there, but I just had a hell of a time installing one, and I could see building the whole damn thing taking just as much time as plugging one in. Am I off to a bad start? Thanks in advance -J (p.s. this is a semi-closed network behind a PHP login, so security isn't a great concern)
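
    For the scale described (20-50 users), the polling approach is just a timer plus a lightweight request; a minimal browser-side sketch (the endpoint name and renderer are hypothetical):

        // Poll the server every 2s for messages newer than the last one seen.
        var lastMessageId = 0;
        setInterval(function () {
            var xhr = new XMLHttpRequest();
            xhr.onreadystatechange = function () {
                if (xhr.readyState === 4 && xhr.status === 200) {
                    // hypothetical renderer; it should also advance lastMessageId
                    renderMessages(JSON.parse(xhr.responseText));
                }
            };
            xhr.open("GET", "chat_poll.php?since=" + lastMessageId, true);
            xhr.send();
        }, 2000);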

    Read the article

  • MySQL query very slow on Amazon RDS but really fast on my laptop?

    - by Luc
    I would love to know if anybody knows why this is happening. I've just migrated our website over to Amazon RDS, and our biggest query, which takes 0.2 seconds to execute on my MacBook, takes 1.3 seconds to execute on the most expensive RDS instance. Obviously I've disabled the query cache (and tested this) on my local computer, and both databases are exactly the same: InnoDB, both with the same indexes, etc. It's costing us a fortune ($2000 per month) for the fastest RDS instance and I'm losing faith quickly. Any ideas?
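
    Before blaming RDS outright, it is worth comparing the two servers' buffer pool sizes and query plans; a sketch:

        -- Run on both servers; a cold or undersized buffer pool on the RDS
        -- side is a common cause of this kind of laptop-vs-RDS gap.
        SHOW VARIABLES LIKE 'innodb_buffer_pool_size';
        -- Compare plans; differences point at index/statistics issues.
        EXPLAIN SELECT ...;  -- substitute the slow query here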

    Read the article

  • Using wget to download pdf files from a site that requires cookies to be set

    - by matt74tm
    I want to access a newspaper site and then download their epaper copies (in PDF). The site requires me to login using my email address and password, and then it permits me to access those PDF URLs. I'm having trouble 'setting my session' in wget. When I login to the site from my browser, it sets two cookie values:

        [email protected]
        Password=12345

    I tried:

        wget --post-data "[email protected]&Password=12345" http://epaper.abc.com/login.aspx

    However, that just downloaded the login page and saved it locally. The FORM on the login page has two fields:

        txtUserID
        txtPassword

    and radio buttons like this:

        <input id="rbtnManchester" type="radio" checked="checked" name="txtpub" value="44">

    Another button:

        <input id="rbtnLondon" type="radio" name="txtpub" value="64">

    If I post this to the login.aspx page, I get the same output:

        wget --post-data "[email protected]&txtPassword=12345&txtpub=44" http://epaper.abc.com/login.aspx

    If I add --save-cookies abc_cookies.txt, the file doesn't seem to have anything other than the default content. For the last command, if I add --debug as well, it says:

        ...
        Set-Cookie: ASP.NET_SessionId=05kphcn4hjmblq45qgnjoe41; path=/; HttpOnly
        ...
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId 05kphcn4hjmblq45qgnjoe41
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx'
        ...
        Saving cookies to abc_cookies.txt.

    However, abc_cookies.txt shows ONLY the following:

        # HTTP cookie file.
        # Generated by Wget on 2011-08-16 08:03:05.
        # Edit at your own risk.

    (Not sure why I'm not getting any responses on SO - perhaps SU is a better forum - http://stackoverflow.com/questions/7064171/using-wget-to-download-pdf-files-from-a-site-that-requires-cookies-to-be-set)

    EDIT 1

        C:\Temp>wget --cookies=on --keep-session-cookies --save-cookies abc_cookies.txt --post-data "txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7" http://epaper.abc.com/login.aspx --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        --2011-08-18 08:15:59--  http://epaper.abc.com/login.aspx
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00a2ae80 (new refcount 1).
        ---request begin---
        POST /login.aspx HTTP/1.0
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Content-Type: application/x-www-form-urlencoded
        Content-Length: 100
        ---request end---
        [POST data: txtUserID=abc%40gmail.com&txtPassword=password&txtpub=44&chkbox=checkbox&submit.x=48&submit.y=7]
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:17 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        Set-Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145; path=/; HttpOnly
        Cache-Control: private
        Content-Type: text/html; charset=utf-8
        Content-Length: 107253
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        Length: 107253 (105K) [text/html]
        Saving to: `login.aspx.1'
        100%[======================================>] 107,253  24.9K/s  in 4.2s
        2011-08-18 08:16:05 (24.9 KB/s) - `login.aspx.1' saved [107253/107253]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

        C:\Temp>wget --referer=http://epaper.abc.com/login.aspx --cookies=on --load-cookies abc_cookies.txt --keep-session-cookies --save-cookies abc_cookies.txt http://epaper.abc.com/PagePrint/16_08_2011_001.pdf --debug
        SYSTEM_WGETRC = c:/progra~1/wget/etc/wgetrc
        syswgetrc = C:\Program Files (x86)\GnuWin32/etc/wgetrc
        DEBUG output created by Wget 1.11.4 on Windows-MinGW.
        Stored cookie epaper.abc.com -1 (ANY) / <session> <insecure> [expiry none] ASP.NET_SessionId owcrje55yl45kgmhn43gq145
        --2011-08-18 08:16:12--  http://epaper.abc.com/PagePrint/16_08_2011_001.pdf
        Resolving epaper.abc.com... seconds 0.00, 999.999.99.99
        Caching epaper.abc.com => 999.999.99.99
        Connecting to epaper.abc.com|999.999.99.99|:80... seconds 0.00, connected.
        Created socket 300.
        Releasing 0x00598290 (new refcount 1).
        ---request begin---
        GET /PagePrint/16_08_2011_001.pdf HTTP/1.0
        Referer: http://epaper.abc.com/login.aspx
        User-Agent: Wget/1.11.4
        Accept: */*
        Host: epaper.abc.com
        Connection: Keep-Alive
        Cookie: ASP.NET_SessionId=owcrje55yl45kgmhn43gq145
        ---request end---
        HTTP request sent, awaiting response...
        ---response begin---
        HTTP/1.1 200 OK
        Connection: keep-alive
        Date: Thu, 18 Aug 2011 02:46:30 GMT
        Server: Microsoft-IIS/6.0
        X-Powered-By: ASP.NET
        X-AspNet-Version: 2.0.50727
        content-disposition: attachement; filename=Default_logo.gif
        Cache-Control: private
        Content-Type: image/GIF
        Content-Length: 4568
        ---response end---
        200 OK
        Registered socket 300 for persistent reuse.
        Length: 4568 (4.5K) [image/GIF]
        Saving to: `16_08_2011_001.pdf'
        100%[======================================>] 4,568  7.74K/s  in 0.6s
        2011-08-18 08:16:14 (7.74 KB/s) - `16_08_2011_001.pdf' saved [4568/4568]
        Saving cookies to abc_cookies.txt.
        Done saving cookies.

    Contents of abc_cookies.txt:

        epaper.abc.com  FALSE  /  FALSE  0  ASP.NET_SessionId  owcrje55yl45kgmhn43gq145
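
    One structural point worth checking beyond cookies: login.aspx is an ASP.NET WebForms page, and such pages normally reject POSTs that omit the hidden __VIEWSTATE (and, when enabled, __EVENTVALIDATION) fields, which would explain why the "login" never takes effect server-side. A sketch of the idea, written for a Unix-ish shell with sed available; the field extraction is illustrative, not tested against this site:

        # 1. Fetch the login page, keeping the session cookie.
        wget --keep-session-cookies --save-cookies abc_cookies.txt \
             -O login.html http://epaper.abc.com/login.aspx
        # 2. Pull the hidden ASP.NET fields out of the form.
        VS=$(sed -n 's/.*id="__VIEWSTATE" value="\([^"]*\)".*/\1/p' login.html)
        EV=$(sed -n 's/.*id="__EVENTVALIDATION" value="\([^"]*\)".*/\1/p' login.html)
        # 3. Re-POST with those fields included (values must be URL-encoded).
        wget --load-cookies abc_cookies.txt --keep-session-cookies \
             --save-cookies abc_cookies.txt \
             --post-data "__VIEWSTATE=$VS&__EVENTVALIDATION=$EV&txtUserID=abc%40gmail.com&txtPassword=12345&txtpub=44" \
             http://epaper.abc.com/login.aspx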

    Read the article

  • Firefox 4 is displaying all text as "bold"

    - by Mateo
    I installed Firefox 4 this morning, excited to check out some of the new HTML5 features that have been implemented, and found that all text was displayed in a bold font. (Here is the comparison between what Firefox 4 displays on my machine and Chrome.) It is getting really annoying. So far I've:
    - Emptied the cache/history/etc.
    - Re-installed the various core fonts (Arial, Courier, Times New Roman, etc.)
    - Checked the font defaults in FF4.

    The interesting part to me is that I had FF 3.6.x on this machine with no problems whatsoever, and now this. Any ideas? System: Windows 7 (x64). Update: I just found out that this is happening in IE9 (final) as well, so it must be something with the system fonts. Like I said before, this wasn't happening pre-FF4.

    Read the article

  • Deleting a large number of files on Linux eats up CPU

    - by Sanjay
    I generate more than 50 GB of cache files on my RHEL server (the typical file size is 200 KB, so the number of files is huge). When I try to delete these files it takes 8-10 hours. However, the bigger issue is that the system load goes critical for those 8-10 hours. Is there any way I can keep the system load under control during the deletion? I tried using nice -n19 rm -rf * but that doesn't help with the system load. P.S. I asked the same question on superuser.com but didn't get a good enough answer, so I'm trying here.
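
    The load here is almost certainly I/O wait, which nice(1) does not touch; ionice does. A sketch, assuming the deletion path and an I/O scheduler (CFQ) that honors I/O priorities, with ionice available on the system:

        # Idle I/O class: the deletion only gets disk time nobody else wants.
        ionice -c3 find /path/to/cache -type f -delete
        # Deleting file-by-file with find also avoids one giant rm -rf burst.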

    Read the article
