Search Results

Search found 80052 results on 3203 pages for 'data load performance'.


  • Single web app, multiple web servers

    - by Ramakrishna
    I have a load balancing problem. We developed a web app for nearly 1500 users. As the number of users increased, we became unable to serve requests in a timely manner: a page takes around 10 to 20 seconds to load, and under heavy load it can take a minute. We need each request to be served within 2 or 3 seconds. App developed in: ASP.NET. Hosted in: IIS 7.5. Machine configuration: Windows Server 2008, 8 GB RAM, 1 Mbps bandwidth.

    Read the article

  • PHP class data implementation

    - by Bakanyaka
    I'm studying OOP in PHP and have watched two tutorials that implement a user login/registration system as an example, but the implementations differ. Which approach is the more correct way to work with data such as this? (1) On object creation, load all the data retrieved from the database into a single property called something like _data, and have the other methods operate on that array. (2) Create a separate property for each field retrieved from the database, load each field into its respective property on object creation, and operate on those properties individually. Then, say I want a method that returns a list of all users with their data. Which is better: a method that returns just an array of user data, like Array([0] => array([id] => 1, [username] => 'John', ...), [1] => array([id] => 2, [username] => 'Jack', ...), ...), or a method that creates a new instance of its class for each user and returns an array of objects?
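
    A minimal PHP sketch of the two approaches described above (the class, table, and column names are illustrative assumptions, not taken from the tutorials mentioned):

        <?php
        // Approach 1: keep the whole database row in one _data property.
        class UserArrayStyle {
            private $_data = array();

            public function __construct(array $row) {
                $this->_data = $row;   // e.g. array('id' => 1, 'username' => 'John')
            }

            public function get($field) {
                return isset($this->_data[$field]) ? $this->_data[$field] : null;
            }
        }

        // Approach 2: one property per database field.
        class UserFieldStyle {
            public $id;
            public $username;

            public function __construct(array $row) {
                $this->id       = $row['id'];
                $this->username = $row['username'];
            }
        }

        // Listing all users: plain arrays vs. one object per row.
        function allUsersAsArrays(PDO $db) {
            return $db->query('SELECT id, username FROM users')->fetchAll(PDO::FETCH_ASSOC);
        }

        function allUsersAsObjects(PDO $db) {
            $users = array();
            foreach ($db->query('SELECT id, username FROM users') as $row) {
                $users[] = new UserFieldStyle($row);
            }
            return $users;
        }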

    Read the article

  • Google Analytics data Missing for 4 days

    - by Shumaila
    I used the wrong tracking ID in my e-commerce tracking code for Google Analytics and missed the data for a few days. To add that missing data to my account I have written a short script which manually sends four days' worth of order data to the GA account, but my concern is the date: those orders were placed on different dates, and if I run my script it will send the missing data with the current date, which I do not want. (I want to send the date on which each order was actually placed.) Can anyone help me with this? I am really stuck with my work here.

    Read the article

  • CellC data card not recognized if I boot with it in

    - by Armien
    In 11.10 I find that I can connect to the internet via a broadband connection successfully. The problem is that I cannot have the data card attached to the machine while it boots up. If I leave the data card in the machine during startup, the data card is not picked up and I then cannot connect to the internet. I must first boot up my machine, log in, attach the data card to the USB port and wait some 30 seconds. The broadband connection name will then appear in the network dropdown at the top of the screen, and an internet connection is possible via broadband. Please let me know what must be done to fix this.

    Read the article

  • TSQL Challenge 83 - Compare rows in the same table and group the data

    The challenge is to compare the data of the rows and group the input data. The data needs to be grouped based on the Product ID, Date, TotalLines, LinesOutOfService.

    Read the article

  • Using PDO for Data Management

    - by edorahg
    This is more a design-oriented question than a code-specific one. I am new to PHP and I am planning to use PDO as a data access layer. Say, for instance, I have a class called CITY. If I need to create an instance of this class, what is the best technique? Should I have a singleton DB access class which is used to read and write data at the db layer, or should I delegate that to the individual class object? For example, if I invoke city.save() (city is a class), then the city class will handle saving that city object's data into the database. Excuse my ignorance, but I have a Java background and am trying to understand the best design principle for data management when using PHP.
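
    A minimal PHP sketch of how the two ideas are often combined: a single shared PDO connection behind a static accessor, with the domain object responsible for saving itself. The DSN, credentials, table, and property names are assumptions for illustration only:

        <?php
        // Shared connection: one PDO instance handed out to whoever needs it.
        class Db {
            private static $pdo;

            public static function get() {
                if (self::$pdo === null) {
                    self::$pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
                    self::$pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
                }
                return self::$pdo;
            }
        }

        // Domain object that persists itself ("active record" style).
        class City {
            public $id;
            public $name;

            public function save() {
                $stmt = Db::get()->prepare('INSERT INTO cities (name) VALUES (:name)');
                $stmt->execute(array(':name' => $this->name));
                $this->id = Db::get()->lastInsertId();
            }
        }

        $city = new City();
        $city->name = 'Springfield';
        $city->save();   // the object decides how it is stored

    The alternative the question weighs, a mapper class that lives outside City and is handed a City object to persist, keeps the SQL out of the domain class at the cost of one more class per entity.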

    Read the article

  • Does low latency code sometimes have to be "ugly"?

    - by user997112
    (This is mainly aimed at those who have specific knowledge of low-latency systems, to avoid people just answering with unsubstantiated opinions.) Do you feel there is a trade-off between writing "nice" object-oriented code and writing very fast, low-latency code? For instance, avoiding virtual functions in C++ and the overhead of polymorphism, or rewriting code so that it looks nasty but is very fast? It stands to reason: who cares if it looks ugly (so long as it's maintainable) if you need speed? I would be interested to hear from people who have worked in such areas.

    Read the article

  • Can't view order in magento

    - by koko
    Hi, I've been setting up a fresh magento 1.4.0.1 install, working great so far. I did some test orders just to see. Everything works fine, but when I click on "view order" under "my orders", I get a bunch of error messages:

        There has been an error processing your request
        Notice: iconv_substr() [function.iconv-substr]: Unknown error (0) in /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Helper/String.php on line 98
        Trace:
        #0 [internal function]: mageCoreErrorHandler(8, 'iconv_substr() ...', '/data/web/A1423...', 98, Array)
        #1 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Helper/String.php(98): iconv_substr('1', 0, 50, 'UTF-8')
        #2 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Helper/String.php(173): Mage_Core_Helper_String->substr('1', 0, 50)
        #3 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Helper/String.php(112): Mage_Core_Helper_String->str_split('1', 50)
        #4 /data/web/A14237/htdocs/magento/app/design/frontend/base/default/template/sales/order/items/renderer/default.phtml(58): Mage_Core_Helper_String->splitInjection('1')
        #5 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(189): include('/data/web/A1423...')
        #6 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(225): Mage_Core_Block_Template->fetchView('frontend/base/d...')
        #7 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(242): Mage_Core_Block_Template->renderView()
        #8 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Core_Block_Template->_toHtml()
        #9 /data/web/A14237/htdocs/magento/app/code/core/Mage/Sales/Block/Items/Abstract.php(137): Mage_Core_Block_Abstract->toHtml()
        #10 /data/web/A14237/htdocs/magento/app/design/frontend/base/default/template/sales/order/items.phtml(52): Mage_Sales_Block_Items_Abstract->getItemHtml(Object(Mage_Sales_Model_Order_Item))
        #11 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(189): include('/data/web/A1423...')
        #12 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(225): Mage_Core_Block_Template->fetchView('frontend/base/d...')
        #13 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(242): Mage_Core_Block_Template->renderView()
        #14 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Core_Block_Template->_toHtml()
        #15 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(516): Mage_Core_Block_Abstract->toHtml()
        #16 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(467): Mage_Core_Block_Abstract->_getChildHtml('order_items', true)
        #17 /data/web/A14237/htdocs/magento/app/design/frontend/base/default/template/sales/order/view.phtml(64): Mage_Core_Block_Abstract->getChildHtml('order_items')
        #18 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(189): include('/data/web/A1423...')
        #19 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(225): Mage_Core_Block_Template->fetchView('frontend/base/d...')
        #20 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(242): Mage_Core_Block_Template->renderView()
        #21 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Core_Block_Template->_toHtml()
        #22 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(516): Mage_Core_Block_Abstract->toHtml()
        #23 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(463): Mage_Core_Block_Abstract->_getChildHtml('sales.order.vie...', true)
        #24 /data/web/A14237/htdocs/magento/app/code/core/Mage/Page/Block/Html/Wrapper.php(52): Mage_Core_Block_Abstract->getChildHtml('', true, true)
        #25 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Page_Block_Html_Wrapper->_toHtml()
        #26 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Text/List.php(43): Mage_Core_Block_Abstract->toHtml()
        #27 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Core_Block_Text_List->_toHtml()
        #28 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(516): Mage_Core_Block_Abstract->toHtml()
        #29 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(467): Mage_Core_Block_Abstract->_getChildHtml('content', true)
        #30 /data/web/A14237/htdocs/magento/app/design/frontend/base/default/template/page/2columns-left.phtml(48): Mage_Core_Block_Abstract->getChildHtml('content')
        #31 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(189): include('/data/web/A1423...')
        #32 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(225): Mage_Core_Block_Template->fetchView('frontend/base/d...')
        #33 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Template.php(242): Mage_Core_Block_Template->renderView()
        #34 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Block/Abstract.php(674): Mage_Core_Block_Template->_toHtml()
        #35 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Model/Layout.php(536): Mage_Core_Block_Abstract->toHtml()
        #36 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Controller/Varien/Action.php(389): Mage_Core_Model_Layout->getOutput()
        #37 /data/web/A14237/htdocs/magento/app/code/core/Mage/Sales/controllers/OrderController.php(100): Mage_Core_Controller_Varien_Action->renderLayout()
        #38 /data/web/A14237/htdocs/magento/app/code/core/Mage/Sales/controllers/OrderController.php(136): Mage_Sales_OrderController->_viewAction()
        #39 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Controller/Varien/Action.php(418): Mage_Sales_OrderController->viewAction()
        #40 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Controller/Varien/Router/Standard.php(254): Mage_Core_Controller_Varien_Action->dispatch('view')
        #41 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Controller/Varien/Front.php(177): Mage_Core_Controller_Varien_Router_Standard->match(Object(Mage_Core_Controller_Request_Http))
        #42 /data/web/A14237/htdocs/magento/app/code/core/Mage/Core/Model/App.php(304): Mage_Core_Controller_Varien_Front->dispatch()
        #43 /data/web/A14237/htdocs/magento/app/Mage.php(596): Mage_Core_Model_App->run(Array)
        #44 /data/web/A14237/htdocs/magento/index.php(78): Mage::run('', 'store')
        #45 {main}

    gtx, koko

    Read the article

  • Load some data from database and hide it somewhere in a web page

    - by kwokwai
    Hi all, I am trying to load some data (which may be up to a few thousand words) from the database and store it somewhere in an HTML web page for comparison against the data input by users. I am thinking of loading the data into a textarea inside a div and hiding it:

        <div id="reference" style="display:none;">
          <textarea rows="2" cols="20" id="database">
          html, htm, php, asp, jsp, aspx, ctp, thtml, xml, xsl...
          </textarea>
        </div>
        <table border=0 width="100%">
          <tr>
            <td>Username</td>
            <td>
              <div id="username">
                <input type="text" name="data" id="data">
              </div>
            </td>
          </tr>
        </table>
        <script>
        $(document).ready(function(){
          // comparing the data loaded from the database with the user's input
          if($("#data").val()==$("#database").val()) {alert("error");}
        });
        </script>

    I am not sure if this is the best way to do it, so could you give me some advice and suggest your methods, please?
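
    For completeness, a minimal PHP sketch of the server side this presupposes, i.e. fetching the reference data and writing it into the hidden textarea; the PDO connection, table, and column names here are assumptions:

        <?php
        // Hypothetical connection and query; adjust to the real schema.
        $pdo  = new PDO('mysql:host=localhost;dbname=app', 'user', 'secret');
        $stmt = $pdo->query('SELECT word_list FROM reference_data WHERE id = 1');
        $reference = $stmt->fetchColumn();
        ?>
        <div id="reference" style="display:none;">
          <!-- htmlspecialchars() keeps the data from breaking out of the textarea -->
          <textarea rows="2" cols="20" id="database"><?php echo htmlspecialchars($reference); ?></textarea>
        </div>

    Note that anything embedded this way, even inside a hidden div, is visible in the page source, so if the reference data should not be exposed, comparing on the server after the form is submitted may be the safer route.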

    Read the article

  • How to load object after saving with encodeWithCoder?

    - by fuzzygoat
    EDIT_002: Further rewrite: if I save using the method below how would the method to load it back in look? (moons is an NSMutableArray of NSNumbers)

        // ------------------------------------------------------------------- **
        // METHOD_002
        // ------------------------------------------------------------------- **
        -(void)saveMoons:(NSString *)savePath {
            NSMutableData *data = [[NSMutableData alloc] init];
            NSKeyedArchiver *archiver = [[NSKeyedArchiver alloc] initForWritingWithMutableData:data];
            [moons encodeWithCoder:archiver];
            [archiver finishEncoding];
            [data writeToFile:savePath atomically:YES];
            [archiver release];
            [data release];
        }

    EDIT_003: Found it, my problem was that I was using ...

        [moons encodeWithCoder:archiver];

    where I should have been using ...

        [archiver encodeObject:moons];

    Hence the loader would look like:

        -(void)loadMoons_V3:(NSString *)loadPath {
            NSData *data = [[NSData alloc] initWithContentsOfFile:loadPath];
            NSKeyedUnarchiver *unarchiver = [[NSKeyedUnarchiver alloc] initForReadingWithData:data];
            [self setMoons:[unarchiver decodeObject]];
            [unarchiver finishDecoding];
            [unarchiver release];
            [data release];
        }

    gary

    Read the article

  • Huge performance difference between two web servers, odd behavior seen using process monitor

    - by Francis Gagnon
    We have two ColdFusion servers that have a huge performance difference running the exact same code on the exact same input data. The code in question instantiates a large number of CFCs (ColdFusion Components, which are similar to objects in OOP languages). I compared the two servers by running Process Monitor and then calling the problematic code on both machines. I learned two things. First, ColdFusion opens CFC files every time it instantiates an object. Both servers do this, so it cannot be the cause of the performance difference. Second, the fast server opens the CFC files directly, while the server with the performance problem seems to navigate its way through the path until it reaches the desired CFC file. It does this for every file, even the ones it has previously loaded, and because the code instantiates so many CFCs it becomes very slow. See below the partial Procmon traces that show this behavior. It can take over 60 seconds for the slow server to do what the fast one does in 2 seconds. Can anyone tell me what causes this behavior? Is it a ColdFusion setting? Since ColdFusion runs on top of Java, is it a Java setting? Is it an OS option? The fast server is running Windows XP and I think the slow server is running Windows Server 2003. Bonus question: ColdFusion doesn't seem to perform any READ FILE operations on any of the CFC or CFM files. How can this be?

    Sample of the fast server opening CFC files:

        11:25:14.5588975 jrun.exe QueryOpen C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5592758 jrun.exe CreateFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5595024 jrun.exe QueryBasicInformationFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5595940 jrun.exe CloseFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5599628 jrun.exe CreateFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5601600 jrun.exe QueryBasicInformationFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc
        11:25:14.5602463 jrun.exe CloseFile C:\CF\wwwroot\APP\com\HtmlUtils.cfc

    Equivalent sample of the slow server opening CFC files:

        11:15:08.1249230 jrun.exe CreateFile D:\
        11:15:08.1250100 jrun.exe QueryDirectory D:\org
        11:15:08.1252852 jrun.exe CloseFile D:\
        11:15:08.1259670 jrun.exe CreateFile D:\org
        11:15:08.1260319 jrun.exe QueryDirectory D:\org\cli
        11:15:08.1260769 jrun.exe CloseFile D:\org
        11:15:08.1269451 jrun.exe CreateFile D:\org\cli
        11:15:08.1270613 jrun.exe QueryDirectory D:\org\cli\cpn
        11:15:08.1271140 jrun.exe CloseFile D:\org\cli
        11:15:08.1279312 jrun.exe CreateFile D:\org\cli\cpn
        11:15:08.1280086 jrun.exe QueryDirectory D:\org\cli\cpn\APP
        11:15:08.1280789 jrun.exe CloseFile D:\org\cli\cpn
        11:15:08.1291034 jrun.exe CreateFile D:\org\cli\cpn\APP
        11:15:08.1291709 jrun.exe QueryDirectory D:\org\cli\cpn\APP\com
        11:15:08.1292224 jrun.exe CloseFile D:\org\cli\cpn\APP
        11:15:08.1300568 jrun.exe CreateFile D:\org\cli\cpn\APP\com
        11:15:08.1301321 jrun.exe QueryDirectory D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1301843 jrun.exe CloseFile D:\org\cli\cpn\APP\com
        11:15:08.1312049 jrun.exe CreateFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1314409 jrun.exe QueryBasicInformationFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1314633 jrun.exe CloseFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1315881 jrun.exe CreateFile D:\
        11:15:08.1316379 jrun.exe QueryDirectory D:\org
        11:15:08.1316926 jrun.exe CloseFile D:\
        11:15:08.1330951 jrun.exe CreateFile D:\org
        11:15:08.1338656 jrun.exe QueryDirectory D:\org\cli
        11:15:08.1339118 jrun.exe CloseFile D:\org
        11:15:08.1526468 jrun.exe CreateFile D:\org\cli
        11:15:08.1527295 jrun.exe QueryDirectory D:\org\cli\cpn
        11:15:08.1527989 jrun.exe CloseFile D:\org\cli
        11:15:08.1531977 jrun.exe CreateFile D:\org\cli\cpn
        11:15:08.1532589 jrun.exe QueryDirectory D:\org\cli\cpn\APP
        11:15:08.1533575 jrun.exe CloseFile D:\org\cli\cpn
        11:15:08.1538457 jrun.exe CreateFile D:\org\cli\cpn\APP
        11:15:08.1539083 jrun.exe QueryDirectory D:\org\cli\cpn\APP\com
        11:15:08.1539553 jrun.exe CloseFile D:\org\cli\cpn\APP
        11:15:08.1544126 jrun.exe CreateFile D:\org\cli\cpn\APP\com
        11:15:08.1544980 jrun.exe QueryDirectory D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1545482 jrun.exe CloseFile D:\org\cli\cpn\APP\com
        11:15:08.1551034 jrun.exe CreateFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1552878 jrun.exe QueryBasicInformationFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc
        11:15:08.1553044 jrun.exe CloseFile D:\org\cli\cpn\APP\com\HtmlUtils.cfc

    Thanks

    Read the article

  • Expected IOPS for log writing on PS6000X SAN?

    - by dssz
    Customer is experiencing poor Sybase ASE 15 performance on a PS6000X SAN with 16 x 450 GB 10K drives in RAID-50. The server is a Dell R710 running Windows Server 2003 R2 64-bit in ESX 4.0.0, 256968. I've used sqlio to benchmark the sequential write performance of 4 KB blocks on the drive: sqlio -kW -t1 -s600 -dE -o1 -fsequential -b4 -BH -LS sqliotestfile.dat. Result is 1900 IOPS. However, when Sybase is running a sustained workload of small inserts, SAN HQ shows a consistent 590 IOPS (and 100% 4K write activity). It also shows that the write latency increases to 1.2 ms from <1 ms. Monitoring and tests in Sybase demonstrate that the performance problem is IO related, and in particular there is a lot of wait time writing to the log. The SAN indicates that write caching is enabled. What IOPS should the SAN be capable of for 4K sequential write activity? Also, with write caching enabled, shouldn't the controller be batching up the 4K writes into something more efficient? Finally, any tips on Sybase on ESX would be appreciated.

    Read the article

  • ESX 4.0 space: DASD, NAS, or ?

    - by thormj
    I put together an ESX box for better management, but its performance is a WTF item; I'm a noob at dealing with ESX, so I'm looking for a laundry list of reading material to help me straighten this out so I can go back to .NET programming.

    Current storage system: We're running RAID5 + hot spare (8x500 GB spindles) on a PERC6i in a Dell 2910. Due to ESX limitations, the PERC is showing the storage as 1x2TB + 1x800GB "partitions." I'm not sure of the setup's configuration (stride / stripe / ???) at all.

    Our applications: We have an SBS server as well as a minor (2x50 GB, but growing at 10 GB/month) database server... Our application that lives on the database VM is CPU- and I/O-intensive; it's a database-churning exercise mixed in with a lot of computation on the data (fixing that performance is what I'm supposed to be working on)...

    Performance issue: When I do a backup, restore, or worse (copy a backup from one VM to another to move it to the QA VM), the entire system slows to a crawl (even "unrelated" VMs). I originally thought a DASD situation would be quite good since you have PCI-X bandwidth, but the system-wide slowdown is killing productivity.

    Questions: What should I do to make an intelligent decision about NAS vs RAID vs SAN vs DASD? Are there sweet spots/ugly spots in the storage setup? Can you use an SSD PCI-X card in ESX for the tempdb? Good/bad idea? Is there any way to "share" some image in a copy-on-write fashion? Most of the "backup-copy-restore" is to "put a clean image on the dev boxes"; if I could have them "share" the master image, the "big copy" (2x50 GB) would only need to be done once per week instead of once per dev per week... [runtime performance isn't a concern with the dev boxes, but the backup/copy/restore kills production, SBS, and everything else on the box]

    Read the article

  • Building a PC, advice on SSD/Hybrid Hard Drives

    - by Jamie Hartnoll
    I am looking at building a new PC; it's mainly for office (graphics-heavy) use and programming. I'm looking for good performance when opening and closing programs and files, as well as a fast boot. I plan to have 3 primary hard drives: Windows 7; Programs (Photoshop etc.); Current Files. (There'll also be a large-capacity backup drive, but this will be the Seagate drive I already have.) So, my question is: looking at standard "old fashioned" hard drives and SSD drives, obviously there's a massive price difference. I have been looking at drives like this: http://www.ebuyer.com/268693-corsair-120gb-force-3-ssd-cssd-f120gb3-bk-cssd-f120gb3-bk and this: http://www.ebuyer.com/321969-momentus-xt-750gb-sata-2-5in-7200rpm-hybrid-8gb-ssd-in-st750lx003 Having no experience of using either, I don't know what's the most efficient thing to go for. Clearly the SSD will have better performance, but if, for example, I had an SSD for Windows (say about 100 GB), that would clearly give me the boot speed I want. Then I guess my real questions are: If I were to buy one more SSD, would it give the greatest improvement over standard performance if used to store programs, or currently used files? Given that the OS is on an SSD, should I not bother with the 3 drives and instead partition that hybrid drive to store programs and currently used files on it? Obviously, option two is cheaper and option one could cause me storage issues, but that's when I can dump files I am not currently using onto another drive. Anyway, I am open to suggestions... so what do you suggest?!

    Read the article

  • Would an array of SSD drives be able to successfully substitute the system memory?

    - by Florin Mircea
    I watched a few videos trying to answer this. This video (youtube.com/watch?v=eULFf6F5Ri8) shows a bunch of guys stacking 24 SSDs, reaching a peak of around 2 GBps r/w. That's under the limit of the worst DDR3 in this list (memorybenchmark.net/write_ddr3_amd.html), which shows DDR3 memory performance varying from 2.78 to 6.55 GB per second, but that video is over 3 years old. This video (youtube.com/watch?v=27GmBzQWwP0) shows a more optimistic situation, but for PCI-E SSD drives: 5 drives peaking at around 4 GB. And this other video shows that stacking up more than 3 SSDs doesn't realistically offer a substantial added performance. This, and the fact that in all benchmarks the drives act quite poorly when dealing with small files (5k file read/write averaging from 10 MB to around 30-40 MBps) as opposed to how native memory handles such files, seems to indicate a definite NO to this question. Also, the write life cycle is indeed limited and the drives might wear out quickly, as kindly pointed out by paddy. However, I wanted to get more opinions on this. Would it be possible to at least obtain current memory performance with SSDs in RAID 0? And if so, in what circumstances? I am assuming using this configuration with a Windows OS that has its memory pagefile resident on that stack of SSDs, thus making it very fast to work with.

    Read the article

  • WAMP running extremely slow on WIndows 7

    - by JavaCake
    After two days of a tough fight trying to figure out what the problem is with my Windows 7 32-bit machine at work, I have nearly given up. The issue is that pages load extremely slowly; the performance is the same whether the site is accessed locally (127.0.0.1) or from another computer on the intranet.

    First, to explain the system:
    WAMP version: Apache 2.2.22 – MySQL 5.5.24 – PHP 5.4.3
    XDebug 2.1.2, XDC 1.5, phpMyAdmin 3.4.10.1, SQLBuddy 1.3.3, webGrind 1.0
    DocumentRoot: located on a network drive
    MySQL: InnoDB
    Pages: PHP, MySQL, AJAX etc.

    So basically, the changes I have made in order to get greater performance:
    Changed C:\windows\system32\drivers\etc\hosts: 127.0.0.1 localhost 127.0.0.1 127.0.0.1
    Modified my.ini: innodb_flush_log_at_trx_commit = 2
    Modified httpd.ini: EnableMMAP on, EnableSendfile on
    Modified php.ini: realpath_cache_size = 4m

    How I measure the performance is the overall load time of the page. I run it locally on my Mac OS X machine as well (MAMP), and typically the front-page load time is 0.06 seconds, but on the Windows 7 machine it is 6-10 seconds. I have verified the load time with the developer tools in Chrome as well. Furthermore, the result is identical in XAMPP.
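
    As a quick sanity check (an assumption of mine, not part of the original post), a tiny PHP page can confirm whether the realpath_cache_size change has actually taken effect and is being used:

        <?php
        // Configured limit vs. how much of the realpath cache is currently in use.
        echo 'realpath_cache_size setting: ' . ini_get('realpath_cache_size') . "\n";
        echo 'realpath cache in use: ' . realpath_cache_size() . " bytes\n";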

    Read the article

  • Data recovery; nearly 1 tb of movies on a WD 3.5 tb personal cloud drive disappears with scanty traces

    - by Effector Dhanushanth
    I have a great collection of movies that I had stored in a logical mesh of folders on my 3.5 TB WD personal cloud drive. I woke up one morning and found that everything was fine with my data on this drive, except for my movie collection. There were two great folders, one "2sort" and the other "segregated". Out of all the segregated subfolders, only letters C, D and 2 or 3 others remain, and the 2sort folder, which has umpteen subfolders amounting to more than 0.5 TB... it's just gone!! This is a great downfall. Now, this is a personal cloud drive and unfortunately has no USB port etc. to hardwire it and recover files. I'm sure there is software out there that can help me recover my beloved movies from such an interestingly "hard-to-reach" (should I say?) device? What might that software be, compadre? My happiness lies within your answer. Thank you. Remember: recovery software or (WD) personal cloud. :) These movies were all "hand-picked" over the course of ten years; I just never catalogued my collection. If I could just get the "list" of my lost collection, that'd be enough; recovering them would be a bonus, but they're bound to be damaged if I were to somehow recover them, you know? Still, I'm certain they're all intact. I guess the file index just got corrupted. There surely is a veil of some sort that needs to be thrown or pushed aside to reveal my movies. What software can do that? Thanks immensely!

    Read the article

  • Java file manager won't load in Firefox

    - by Arthur
    I am using Webmin and I can't get the file manager to load in Firefox. It is a simple, Java based file manager. When I try to load it I get the following error: This module requires java to function, but your browser does not support java Internet Explorer works fine and I have yet to try on Chrome. Java is installed and I have the same problem on Windows and Linux. Java seems to work fine with everything else, with the exception of webcams. Any advice on the issue would be appreciated. Edit: I just checked and it doesn't work in Chrome either.

    Read the article

  • Load website and fill form from the command line

    - by Martin Scharrer
    Using the Linux command line (Bash shell), I'd like to load a specific website in my browser (normally Firefox, but another one would be OK as well, as long as it runs under Linux) and fill a predefined form with some data. Actually, this should run from a Makefile. Most of the form data is static and will be stored as variables in the Makefile; just some fields are to be filled in manually before the form is sent manually. I know how to load the website in question from the command line using: firefox <URL>. But there seems to be no way to fill the form automatically with variables given on the command line. Is there a plugin, executable or JavaScript which allows me to do this? Any suggestions and hints are welcome. I don't mind coding some JavaScript.

    Read the article

  • MySQL Database Replication and Server Load

    - by Willy
    Hi everyone, I have an online service with around 5000 MySQL databases. Now, I am interested in building a development area with exactly the same environment in my office, so I am about to set up MySQL replication between my live MySQL server and the development MySQL server. My concern is the load that will occur on my live MySQL server once replication is started. Do you have any experience with this? Will this process cause extra load on my production server? Thanks, have a nice weekend.

    Read the article
