Search Results

Search found 30046 results on 1202 pages for 'document load'.


  • High CPU Steal percentage on Amazon EC2 Instance

    - by Aditya Patawari
    I am experiencing a high CPU steal percentage on an Amazon EC2 large instance. I know it means that my virtual CPU is waiting on the real CPU of the machine for time. My question is: what can I do to reduce this percentage and get the maximum out of the CPU? Steal percentage is consistently at 20%. System load crosses 10 when this happens. I have checked memory and network and I am sure that they are not the bottleneck. Is that normal for such an environment? Also, are there any system-level optimization techniques for reducing the steal percentage from within the virtual instance?

      avg-cpu:  %user  %nice  %system  %iowait  %steal  %idle
                52.38   0.00     8.23     0.00   21.21  18.18
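
    A quick way to watch steal over time rather than in a single snapshot is the sysstat tools; this is only a sketch and assumes sysstat is installed on the instance:

      # CPU utilisation, including %steal, printed every 5 seconds
      iostat -c 5
      # per-vCPU breakdown, useful for spotting whether one vCPU is starved
      mpstat -P ALL 5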

    Read the article

  • JavaScript Module pattern with DOM ready

    - by dego89
    I am writing a JS Module pattern to test out code and help me understand the pattern, using a JSFiddle. What I can't figure out is why my "private methods" on lines 25 and 26, when referenced via DOM ready, have a value of undefined. JSFiddle code sample:

      var obj = {
          key: "value"
      };

      var Module = (function () {
          var innerVar = "5";
          console.log("obj var in Module:");
          console.log(obj);

          function privateFunction() {
              console.log("privateFunction() called.");
              innerFunction();

              function innerFunction() {
                  console.log("inner function of (private function) called.");
              }
          }

          function _numTwo() {
              console.log("_numTwo() function called.");
          }

          return {
              test: privateFunction,
              numTwo: _numTwo
          };
      }(obj));

      $(document).ready(function () {
          console.log("$ Dom Ready");
          console.log("Module in Dom Ready: ");
          console.log(Module.test());
      });
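
    For what it's worth, the undefined seen in the console comes from logging the return value of a function that returns nothing, not from the module failing; a minimal illustration of the difference:

      function privateFunction() {
          console.log("privateFunction() called.");
          return "something";               // without this return...
      }
      console.log(privateFunction());       // ...this line prints undefined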

    Read the article

  • 301 redirect from HTTP to HTTPS - how to be sure Google is fetching the correct information?

    - by user33692
    I'm hoping somebody might be able to provide a bit of advice on an issue I am having. I have one site where we implemented a 301 redirect on the homepage from HTTP to HTTPS. We have links on the homepage to other parts of the site that are not under SSL (in fact there is only one other page under SSL). When I go to our Webmaster Tools account I notice that we are not being provided with any webmaster information (e.g., search queries, backlinks, etc.) related to our homepage under SSL. I performed a Fetch as Google on the homepage and the information it returned is:

      HTTP/1.1 301 Moved Permanently
      Date: Fri, 08 Nov 2013 17:26:24 GMT
      Server: Apache/2.2.16 (Debian)
      Location: https://mysite.com/
      Vary: Accept-Encoding
      Content-Encoding: gzip
      Content-Length: 242
      Keep-Alive: timeout=15, max=100
      Connection: Keep-Alive
      Content-Type: text/html; charset=iso-8859-1

      <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
      <html><head>
      <title>301 Moved Permanently</title>
      </head><body>
      <h1>Moved Permanently</h1>
      <p>The document has moved <a href="https://mysite.com/">here</a>.</p>
      <hr>
      <address>Apache/2.2.16 (Debian) Server at mysite.com</address>
      </body></html>

    I am worried that Fetch as Google is not getting the correct title tags and meta information from our homepage and that this is hurting our search results. Additionally, I am worried that we need to do something specific with the sitemap to ensure that Google is correctly indexing all our pages and is able to flow from the HTTPS pages to the HTTP pages without issues. Does anybody have any advice on how we can correctly set this up or be sure that Google is fetching the correct information?
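
    For reference, a common way to express an HTTP-to-HTTPS redirect in Apache is a mod_rewrite rule like the one below; this is only a sketch and assumes mod_rewrite is enabled (the question does not show the actual redirect configuration used):

      RewriteEngine On
      RewriteCond %{HTTPS} off
      RewriteRule ^ https://%{HTTP_HOST}%{REQUEST_URI} [R=301,L]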

    Read the article

  • Dual boot Windows 8 Pro and Windows 7 on XPS 8500 Special Edition

    - by Jesse
    I am trying to install a dual boot with Windows 7 Premium and Windows 8 Pro on an XPS 8500 Special Edition. I created a new primary partition on my C: drive, inserted the Windows 8 install disk, and rebooted my computer from DVD. I select the custom install option, and the dialog box asking where I want to install Windows pops up, but none of my drives are listed. Please help me determine what is going on. I don't understand why none of my drives are showing up in this menu, not even the original drive. When I go to Load Driver and click on the partition I created, it tells me "No signed device drivers were found. Make sure the installation media contains the correct drivers, and then click OK."

    Read the article

  • How would I measure the amount of RAM needed per Glassfish domain? [closed]

    - by oligofren
    Possible Duplicate: Can you help me with my capacity planning? In our test environment we have a lot of apps spread out over a few servers and Glassfish domains. To make versioning easier I would like to have one Glassfish domain per customer per app (kind of like a heavyweight version of lots of Jetty instances). But I have heard that Glassfish is kind of heavy on resources, and so I would need to measure approximately how many instances would fit in the available RAM. These are low-traffic/low-load testing servers, so CPU is not really an issue, though RAM might be. How would I get an approximate measure of how much RAM is needed? This is one Glassfish 3 instance with one heavy EAR application deployed. top? jvmstat? ??
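
    As a rough sketch of the kind of measurement being asked about (the pid below is a placeholder for a domain's JVM process), the JDK's own tools plus ps give a per-domain figure:

      # resident memory of one Glassfish domain's JVM, in kilobytes
      ps -o rss= -p <pid>
      # heap and GC detail for the same process, sampled every 5 seconds
      jstat -gc <pid> 5s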

    Read the article

  • Can't type text into table cell when form is protected in Word 2002

    - by Gus
    I created several tables in a Word 2002 document to act as a form for users to fill out. In some cells I added check boxes and dropdown lists for people to select options from. When I click on the Protect Form button, the check boxes and dropdowns become active; however, I can't type anything in the other cells. If I click in an empty cell it automatically moves to the next checkbox. When I unprotect the form I can type in the empty cells, but then the checkboxes and dropdowns become useless. I know that the user can double-click on the checkbox and then manually select to have it checked, but they can't do this with the dropdowns. What do I have to do to allow the user to type responses into the table cells and also select answers from the checkboxes and dropdowns? Thanks!

    Read the article

  • App to slice'n'dice video, specifically remove chunks, on a Mac?

    - by Phillip Oldham
    I have a couple of collections of DVD box sets I've ripped to my Mac. Now I'd like to sweeten the viewing experience by removing the title sequences and credits, so that viewing doesn't mean I have to keep reaching for the remote to skip 30 seconds of annoying music (think watching multiple episodes of Family Guy). If I can find an app that will let me do this reasonably quickly by hand, that would be great, but it would be perfect if I could dump a load of commands into a file and have everything trimmed while the Mac is "inactive". I'm thinking that if I can specify chunks of time to remove from the original file, that would be perfect. I had a quick look at importing into iMovie to do it manually and gave up at the "Processing Thumbnails" stage, as it said it would take a couple of hours for a 45-minute MP4 file, which I can understand at 25fps, but I'm not willing to wait, especially when I've got over a week's worth of files. Any suggestions?
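
    One command-line route that fits the "dump a load of commands into a file" idea is ffmpeg (installable via Homebrew or MacPorts); a sketch only, with placeholder file names and times, and it handles a leading/trailing cut rather than removing a chunk from the middle:

      # keep 20m45s of video starting 30s in, copying streams without re-encoding
      ffmpeg -ss 00:00:30 -i episode01.mp4 -t 00:20:45 -c copy episode01-trimmed.mp4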

    Read the article

  • How to create a repeatable table with unique IDs using jQuery

    - by milbert
    I need to create a table structure that can be "copied" and populated with a new set of data. However, each table must have unique IDs for functions that must access them later. For example:

      <table class="main">
          <thead><tr><th class="header"></th></tr></thead>
          <tbody>
              <tr class="row"><td class="col0"></td><td class="col1"></td></tr>
          </tbody>
      </table>

    My current thought is to use jQuery to load the table from a separate HTML file into a variable. Using this saved table I could then write a function that copies it, traverses the table to add an ID to each section where information will need to be appended from a separate data source, and returns this new table. I am new to jQuery and feel like I may be missing an easier/better way to accomplish this. Any help on this subject would be appreciated.
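
    A minimal sketch of the cloning approach described above; #tableTemplate and the id scheme are hypothetical, not from the original markup:

      // clone a hidden template table and give every element inside it a unique id
      function buildTable(suffix) {
          var $copy = $('#tableTemplate').clone();
          $copy.attr('id', 'table-' + suffix);
          $copy.find('*').each(function () {
              if (this.id) { this.id = this.id + '-' + suffix; }
          });
          return $copy.appendTo('body');
      }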

    Read the article

  • PHP session files have permissions of 000 - They're unusable

    - by vanced
    I kept having issues with a Document Management System I'm trying to install: at the first step of the installation process, it would error with

      Warning: Unknown: open(/tmp/sess_d39cac7f80834b2ee069d0c867ac169c, O_RDWR) failed: Permission denied (13) in Unknown on line 0
      Warning: Unknown: Failed to write session data (files). Please verify that the current setting of session.save_path is correct (/tmp) in Unknown on line 0

    I looked in /tmp and saw the sess_* files have the following permissions:

      ---------- 1 vanced vanced 1240 Jan 20 08:48 sess_d39cac7f80834b2ee069d0c867ac169c

    All the session files look like this, so obviously they're unusable by PHP and it's causing me lots of problems. How can I get PHP to set the correct permissions? I've tried changing the directory which php.ini uses to /tmp/phpsessions and the same thing occurs. The directories are a+rwx.
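
    A small sketch of one way to take /tmp out of the picture while debugging: point sessions at a directory owned by the web-server user before session_start() (the path below is an example, not from the question):

      <?php
      // use a private, web-user-owned session directory instead of /tmp
      ini_set('session.save_path', '/var/www/example/sessions');
      session_start();
      var_dump(session_save_path(), session_id());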

    Read the article

  • Windows 7 file explorer preview window and password protected word documents

    - by Carbonara
    When using Windows 7 Explorer with the preview pane open, you get a little preview of a file when you click on it. This includes Word documents, Excel spreadsheets, etc. My problem is when the Word document is password protected. Clicking on it in Explorer automatically asks for the password to display its preview, whether you single- or double-click on it. You then get an empty Word instance running (which allows it to display the preview) and another instance of Word with your actual file, and you're asked for the password twice in total. This is annoying and untidy. Is there a way of stopping the preview pane from trying to display password-protected documents, and thus not asking for the password to display a preview?

    Read the article

  • Need solutions in sharing a 3Mb/768Kbps DSL line to 60+ users and faster bandwidth

    - by elistp
    Two parts. Part 1: We currently have two DSL lines with 3Mb/768Kbps speeds, load balanced, for 60+ users. Accessing the Internet is borderline unusable. The simple solution would be to get a faster DSL line, but the highest DSL package is 6Mb/768Kbps, comes with quite a price jump, and doesn't do anything to help with upload speeds. I'm looking for free or extremely low-cost solutions (web cache, traffic shaping, bandwidth controls, etc.) to help make Internet access more bearable until the next funding year. Can anyone give any advice? Part 2: We're looking into a 4.5Mb bonded T1 in the next funding year, which is of course significantly more expensive than two DSL lines. Are bonded T1s our only hope for faster speeds? Are there any better alternatives?

    Read the article

  • Ubuntu 12.04 takes too long to boot

    - by msPeachy
    I've recently encountered the following error message:

      mount: mounting /dev/disk/by-uuid/3f7f5cd9d-6ea3-4da7-b5ec-**** on /root failed: Invalid argument
      mount: mounting /sys on /root/sys failed: No such file or directory
      mount: mounting /dev on /root/dev failed: No such file or directory
      mount: mounting /sys on /root/sys failed: No such file or directory
      mount: mounting /proc on /root/proc failed: No such file or directory
      Target file system doesn't have /sbin/init.
      No init found. Try passing init= bootarg.

      BusyBox v1.18.5 (Ubuntu 1:1.18.5-1ubuntu4) built-in shell (ash)
      Enter 'help' for a list of built-in commands.

      (initramfs) _

    I ran sudo fsck /dev/sda2 (the Ubuntu ext4 root partition) from a LiveCD; it checked and fixed the file system. The next time I booted, Ubuntu showed the Ubuntu logo and the dots underneath for several hours (with the mouse pointer active on the screen). I even left the computer on overnight, but it still had not booted successfully or reached the login screen by morning. I booted again with the LiveCD and checked the NTFS partitions with ntfsfix; again the NTFS partitions were checked and fixed successfully. I also edited my fstab and commented out the lines that auto-mount the NTFS partitions. The next time I booted, it took almost 20 minutes for Ubuntu to get to the login screen, and after typing the password it took an additional 10 minutes to get to the desktop. On the desktop, it takes several minutes to open any program; displaying the Dash alone takes 5 minutes! Is there a fix for this without having to reinstall Ubuntu? I don't see or get any errors; Ubuntu is just taking too long to boot and to run programs. Please help!
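
    Given that fsck keeps finding problems and I/O is this slow, checking the disk itself would be a reasonable next step; a sketch from the LiveCD, assuming smartmontools is installed:

      # kernel messages often show ata/IO errors from a failing drive
      dmesg | grep -iE "error|ata|fail"
      # SMART self-assessment and error log for the disk
      sudo smartctl -a /dev/sda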

    Read the article

  • How to allow a single domain name with iptables

    - by Claw
    I am looking for a way to make iptables accept only requests for my domain name and reject the others. Lately I misconfigured my Apache proxy; it is now fixed, but I keep receiving a load of requests that look like this:

      xxxx.xx:80 142.54.184.226 - - [12/Sep/2012:15:25:14 +0200] "GET http://ad.bharatstudent.com/st?ad_type=iframe&ad_size=700x300&section=3011105&pub_url=${PUB_URL} HTTP/1.0" 200 4985 "http://www.gethealthbank.com/category/medicine/" "Mozilla/4.0 (compatible; MSIE 5.5; Windows NT 4.0)"
      xxxx.xx:80 199.116.113.149 - - [12/Sep/2012:15:25:14 +0200] "GET http://mobile1.login.vip.ird.yahoo.com/config/pwtoken_get?login=heaven_12_&src=ntverifyint&passwd=7698ca276acaf6070487899ad2ee2cb9&challenge=wTBYIo2AEdMFr6LtdyQZPqYw9FS9&md5=1 HTTP/1.0" 200 425 "-" "MobileRunner-J2ME"

    These are the requests I would like to block. How can I manage this?
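
    Those log lines are proxy-style requests (a GET with a full http:// URL in the request line), so one option, sketched here with the iptables string-match module, is to drop packets carrying that pattern; the cleaner fix is still at the Apache level (make sure ProxyRequests is off and the default vhost refuses unknown Host headers):

      # drop HTTP payloads that contain a proxy-style request line
      iptables -I INPUT -p tcp --dport 80 -m string --algo bm --string "GET http://" -j DROP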

    Read the article

  • What are the common techniques to handle user-generated HTML modified differently by different browsers?

    - by Jakie
    I am developing a website updater. The front end uses HTML, CSS and JavaScript, and the backend uses Python. The way it works is that <p/>, <b/> and some other HTML elements can be updated by the user. To enable this, I load the webpage and, with jQuery, convert all those elements to <textarea/> elements. Once the content of a textarea is changed, I apply the change to the original elements and send it to a Python script to store the new content. The problem is that I'm finding that different browsers change the original HTML. How do you get around this issue? What Python libraries do you use? What techniques or application designs do you use to avoid or overcome this issue? The problems I found are:

    - IE removes the quotes around class and id attributes. For example, <img class='abc'/> becomes <img class=abc/>.
    - Firefox removes the backslash from the line breaks: <br \> becomes <br>.
    - Some websites have very specific display technicalities, so the insertion of a simple "\n" (which IE does) can affect the display of a website. Example: changing <img class='headingpic' /><div id="maincontent"> to <img class='headingpic'/>\n <div id="maincontent"> inserts a vertical gap in IE.

    The things I have unsuccessfully tried to overcome these issues:

    - Using either jQuery or Python to remove all >\n< occurrences, <br> etc. But this fails because I get different patterns in IE, sometimes a ·\n, sometimes a \n···.
    - In Python, parse the new HTML, extract the new text/content, and insert it into the old HTML so the elements and format never change, just the content. This is very difficult and seems to be overkill.
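
    Since the backend is Python, one approach worth sketching is to normalize whatever the browser submits by parsing and re-serializing it server-side, so IE's and Firefox's variations collapse into one canonical form; this assumes a parser such as BeautifulSoup is acceptable in the stack:

      # normalize a browser-submitted fragment into one canonical form
      from bs4 import BeautifulSoup

      def normalize_fragment(html):
          # re-serializing restores quotes around attributes and closes tags consistently
          return str(BeautifulSoup(html, "html.parser"))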

    Read the article

  • Dark NetBeans

    - by Geertjan
    Let's make NetBeans IDE look like this. Not saying it's a nice color or anything, just that it's possible to do so: I changed the coloring in the Java editor by going to Tools | Options, choosing "Fonts & Colors", selecting the "Norway Today" profile, and changing the background setting to Dark Gray. Next, I put this themes.xml file into the "config" folder of the NetBeans IDE user directory, which you can locate via Help | About in the IDE: go to the exact location given by "User directory" in Help | About, and then to the "config" folder within that folder. The "config" folder of the user directory is the readable/writable root of the NetBeans IDE virtual filesystem. If a themes.xml file is found there, it is used, as described here. Then, in the netbeans.conf file, which is not in the NetBeans user directory but in the "etc" folder of the NetBeans installation directory, I added the following to "netbeans_default_options":

      -J-Dnetbeans.useTheme=true --laf Metal

    The first of these enables usage of the themes.xml file, i.e., it notifies NetBeans IDE at startup to load the themes.xml file and apply its content to the relevant UI components, while the second is needed because most/all of the themes only work if you're using the Metal look and feel. Note: I must add that in most cases, whatever it is you're trying to achieve via a themes.xml file can probably be achieved in a different, and better, way. The themes.xml mechanism has been there forever, but is not actively supported or tested, though it may work for the specific thing you're trying to do anyway. For example, if you're trying to change the background color of a TopComponent, use the paintComponent method of the TopComponent instead of a themes.xml file.
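
    As a sketch of the alternative mentioned at the end (painting the TopComponent directly instead of relying on themes.xml); the colour here is arbitrary:

      @Override
      protected void paintComponent(java.awt.Graphics g) {
          super.paintComponent(g);
          // fill the whole TopComponent; child components still paint on top of this
          g.setColor(java.awt.Color.DARK_GRAY);
          g.fillRect(0, 0, getWidth(), getHeight());
      }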

    Read the article

  • Dynamically changing one-node Cassandra cluster to two nodes

    - by Jason Axelson
    So I have an application that will be mostly dormant but will need to handle heavy bursts a few days out of the month. Since we are deploying on EC2, I would like to keep only one Cassandra server up most of the time and then, on burst days, bring one more server up (with more RAM and CPU than the first) to help serve the load. What is the best way to do this? Should I take a different approach? Some notes about what I plan to do:

    - Bring the node up and repair it immediately
    - After the burst time is over, decommission the powerful node
    - Use the always-on server as the seed node

    My main question is how to get the nodes to share all the data, since I want a replication factor of 2 (so both nodes have all the data), but that won't work while there is only one server. Should I bring up two extra servers instead of just one?
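
    The plan above maps onto two standard nodetool calls (host names are placeholders); only a sketch of the burst-day routine:

      # after the burst node has joined the ring and taken its token range
      nodetool -h burst-node repair
      # when the burst is over, stream its data back and remove it cleanly
      nodetool -h burst-node decommission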

    Read the article

  • Real-time Image Resize, Cropping and Caching Server Product

    - by Elijah
    I'm investigating what products are out there that will allow you to request images through an HTTP API in arbitrary image sizes. The server would sit behind a CDN but would still need to be able to handle a fair bit of traffic and possibly be load-balanced. I've been tasked with writing such a service, but I wanted to do some due diligence to see what commercial or open source solutions are out there. Google has not been particularly helpful; it may be because I have been searching for the wrong term. Third-party sites and services are out of the question because of corporate policies.

    Read the article

  • JEditorPane Code Completion

    - by Geertjan
    Code completion in a JEditorPane: unfortunately, a lot of this solution depends on the Java Editor support in the IDE. Therefore, to use it in its current state, you'll need lots of Java Editor related JARs even though your own application probably doesn't include a Java Editor. A key thing one needs to do is implement the NetBeans Code Completion API, using the related tutorial in the NetBeans Platform Learning Trail, but register the CompletionProvider as follows:

      @MimeRegistration(mimeType = "text/x-dialog-binding", service = CompletionProvider.class)

    Then in the TopComponent, include this code, which will bind all the completion providers in the above location, i.e., text/x-dialog-binding, to the JEditorPane:

      EditorKit kit = CloneableEditorSupport.getEditorKit("text/x-java");
      jEditorPane1.setEditorKit(kit);
      FileObject fob;
      try {
          fob = FileUtil.getConfigRoot().createData("tmp.java");
          DataObject dob = DataObject.find(fob);
          jEditorPane1.getDocument().putProperty(
                  Document.StreamDescriptionProperty,
                  dob);
          DialogBinding.bindComponentToFile(fob, 0, 0, jEditorPane1);
          jEditorPane1.setText("Egypt");
      } catch (IOException ex) {
          Exceptions.printStackTrace(ex);
      }

    Not a perfect solution, a bit hacky, with a high overhead, but a start nonetheless. Someone should look in the NetBeans sources to see how this actually works and then create a generic solution that is not tied to the Java Editor.

    Read the article

  • .htaccess URL rewrite to multi-parameter item

    - by MrCS
    I just spent the last 10 hours of my life on this and am running in circles, so I was hoping someone may be able to help me. I want a specific URL to load like this:

      http://example.com/f/2011/image.png?attribute=small

    When a URL in a format such as this hits, I'd like to rewrite it so that it hits the server as:

      http://example.com/generate.php?f=2011/image.png&attribute=small

    Based on the above, my question is twofold: How can I rewrite the URL in .htaccess to meet my requirements above? And if the original URL didn't have the attribute query string parameter, how can I ensure attribute will be false/blank/etc. when it hits the server via .htaccess?
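
    A minimal sketch of a rule matching that requirement, assuming mod_rewrite is enabled in the .htaccess context; QSA carries the original query string through, and if ?attribute= was never sent, $_GET['attribute'] is simply unset on the PHP side:

      RewriteEngine On
      # /f/2011/image.png?attribute=small  ->  /generate.php?f=2011/image.png&attribute=small
      RewriteRule ^f/(.+)$ generate.php?f=$1 [QSA,L]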

    Read the article

  • dig gets the right result from DNS server, but name still fails to resolve

    - by EMiller
    Under what conditions would the following occur? From a given OS X machine on an internal network:

      $~ cat /etc/resolv.conf
      nameserver 10.102.120.7
      nameserver 10.102.120.2

    From the same machine:

      $~ dig @10.102.120.7 in.local
      <snip> ...
      ;; QUESTION SECTION:
      ;in.local.                IN      A

      ;; ANSWER SECTION:
      in.local.         43200   IN      A       10.102.123.30
      <snip> ...

    And yet, this workstation cannot ping in.local, nor load pages hosted by Apache on that machine. 10.102.123.30 is definitely up (two OS X machines I know of fail to resolve in.local, but other machines on the network can). I have also checked their /etc/hosts to see if anything there might interfere... Not sure what else to check...
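
    One detail worth checking on the two OS X machines: names ending in .local are normally handed to multicast DNS (Bonjour) rather than the servers in /etc/resolv.conf, which would explain dig succeeding while ping fails. A sketch of how to compare what the system resolver (as opposed to dig) returns:

      # query through the OS X resolver framework instead of straight DNS
      dscacheutil -q host -a name in.local
      # show the resolver configuration, including any mDNS handling of .local
      scutil --dns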

    Read the article

  • How to block this URL pattern in Varnish VCL?

    - by iTech
    My website is getting badly hit by spambots and scrapers. I am using Cloudflare, but the problem is still there. The problem is spambots accessing non-existent URLs, which causes a lot of load on my Drupal backend: each request goes all the way through and bootstraps the database just to serve a 404 error document. I can't simply dish out non-Drupal 404s for all page-not-found errors, as I need to have Drupal catch them. Since Varnish is in front, it can check whether the bot is behaving and asking for a valid URL; if not, it serves a 404 or 403. These bots are causing errors using this pattern:

      http://www.megaleecher.net/http:/www.megaleecher.net/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_S/Using_iPhone_As_USB_Mass_Storage

    Now, please suggest a regex Varnish VCL directive which catches this URL pattern and serves a 404 error from Varnish, preventing it from reaching Apache/Drupal.
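
    A hedged sketch of the kind of vcl_recv check being asked for, written in Varnish 3 syntax (Varnish 4+ would use return(synth(404)) instead of error); the regex keys off the doubled scheme visible in the example URL:

      sub vcl_recv {
          # requests whose path embeds another absolute URL never reach the backend
          if (req.url ~ "^/http:/") {
              error 404 "Not found";
          }
      }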

    Read the article

  • Hosting and domain registrations for multiple clients

    - by letseatfood
    I am finally getting regular work designing, developing, and deploying websites for small businesses and individuals. So far the websites use single-user content management systems, so they create, as far as I know, minimal load on the shared servers. I have always required that each of my clients purchase annual shared hosting at Dreamhost. For domain registration, I ask that they register with Dreamhost, but some already have a domain registered elsewhere, and this is fine with me. I do this so the billing issues are the client's responsibility, not mine. My question is: since I can register unlimited domains and connect them to my one shared hosting account at Dreamhost, should I not be requiring clients to individually pay for shared hosting and a domain? Should I actually be paying for one hosting account and then hosting all of my clients' websites on that account? As I said before, I currently have each client buy their own hosting because I feel that, for example, if there is high traffic to their site, there is less chance of the site going down than if it were hosted with many others on one account. I am famous for being long-winded, so please let me know if I can clarify at all. Thanks!

    Read the article

  • Transfer hard drive with Windows XP to another computer; on booting, it asks to activate XP

    - by Jesse
    I had an old computer sitting around that I have not been able to boot successfully. I took its hard drive out and placed it in my newer computer. If I boot Linux, I can mount the XP hard drive and access the files. If I try to boot from the XP hard drive, it will boot, but it asks me to activate Windows before proceeding. If I continue, I get the "activation window" with two images/icons(?) which fail to load, and nothing else happens. The version of Windows came with the original computer the hard drive came from, so I'm not sure if I'm married to the broken computer (I hope not!). Is there anything I can do in order to boot into XP from the new computer?

    Read the article

  • Influence Maps for Pathfinding?

    - by james
    I'm taking the plunge and getting into game dev. It's been going well, but I've got stuck on a problem. I have a maze that is 100x100, with 0 or 1 indicating whether each cell is a path or a wall. Within the maze I have 300 or so enemies and a player. The outcome I'm looking for is that all the enemies work their way towards the player's position. Originally I did this using an A* pathfinding algorithm, but with 300 enemies it was taking forever to pathfind for each one individually. After some research I found that an influence map / collaborative diffusion would be the best way to go. But I'm having a real hard time working out how this is actually done. Firstly: how do you create an influence map? From what I understand, each of my walls will have a scent of 0, which makes them impassable, and then there is basically a radial effect from my player's position out to every other cell (so my player starts at 100, and going outwards from there each cell gets a reduced value). Is that correct? If so, how would you do that (math magic?)? My next problem is: if that is correct, how would my "enemies" avoid getting stuck if they have gone down the wrong way? Say my player is standing on the other side of a wall; if the enemy is just looking for larger numbers, won't it keep getting stuck? I'm doing this in JavaScript, so performance is key. Thanks for any help! EDIT: Or has anyone got a better solution? I've been reading about navmeshes, steering behaviours, pre-calculating all paths on load, etc.
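
    A minimal JavaScript sketch of the diffusion/flood-fill idea: one breadth-first pass from the player scores every walkable cell with its walking distance, all 300 enemies share that single map each frame, and because every reachable cell's value is exactly its distance to the player, an enemy that always steps to its lowest-valued neighbour cannot get stuck behind a wall. Names are illustrative; grid[y][x] === 1 means walkable:

      function buildInfluenceMap(grid, px, py) {
          var h = grid.length, w = grid[0].length, y, x;
          var dist = [];
          for (y = 0; y < h; y++) {
              dist.push([]);
              for (x = 0; x < w; x++) { dist[y].push(Infinity); }
          }
          dist[py][px] = 0;
          var queue = [[px, py]];
          var moves = [[1, 0], [-1, 0], [0, 1], [0, -1]];
          while (queue.length) {
              var c = queue.shift(), cx = c[0], cy = c[1];
              for (var i = 0; i < moves.length; i++) {
                  var nx = cx + moves[i][0], ny = cy + moves[i][1];
                  if (nx >= 0 && ny >= 0 && nx < w && ny < h &&
                      grid[ny][nx] === 1 && dist[ny][nx] === Infinity) {
                      dist[ny][nx] = dist[cy][cx] + 1;
                      queue.push([nx, ny]);
                  }
              }
          }
          return dist; // each enemy steps to the neighbour with the smallest value
      }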

    Read the article

  • Windows server response time very high

    - by Nagaraju Bandla
    Server specs:

    - Windows Server 2008 R2 64-bit
    - Provider: Fasthosts
    - .NET Framework: 4.0
    - 6 GB RAM (it's using 4.6 GB)

    I have a website with thousands of pages, structured like:

      folderone/1/one to 500.aspx
      folderone/2/one to 500.aspx
      .
      .
      folderone/500/one to 500.aspx

    Loading these pages for the first time after a release takes about 20 to 30 minutes per folder, and once one page in a folder has loaded, the rest of the pages load fine. This happens for all folders, and it repeats every time I restart the server, add anything to App_Code, or change the web.config. My site mainly works via Google, and due to this problem it is giving errors. Any help will be highly appreciated; I am happy to buy you a beer if it's resolved. Thanks in advance...
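
    The per-folder delay after every release is consistent with ASP.NET compiling each batch of pages on first hit; one hedged mitigation is to precompile the site before deploying (the framework path shown is the standard .NET 4 location, and the source/target paths are placeholders):

      rem precompile the whole site so first requests don't trigger on-the-fly compilation
      %WINDIR%\Microsoft.NET\Framework64\v4.0.30319\aspnet_compiler.exe -v / -p C:\build\mysite C:\deploy\mysite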

    Read the article
