Search Results

Search found 6355 results on 255 pages for 'slow downs'.


  • Problem showing UTF-8 letters in jQuery

    - by Mac Taylor
    Hey guys, I made a jQuery script, similar to WordPress, that shows a slugged title from an input box while the user is typing:

        $(function() {
            $(".word").keyup(function() {
                var word = $(this).val();
                var inputmirror = $('#plink');
                inputmirror.val(word);
                $.post("slug.php"+"&title="+word, function(data){
                    $('#preshow').html(data).fadeIn("slow");
                });
                return false;
            });
        });

    and the PHP file:

        $title = $_GET['title'];
        if (!empty($title)) {
            echo Slugit($title);
        }

    Everything works fine unless I enter Arabic letters in the input box; then it shows encoded characters like this: http://mysite.info/this-is-arabic-%d8%a7%d9%84%d9%84%d9%87

    Read the article

  • Understanding memory mapping in DirectX

    - by numerical25
    So my question is: "When you use the mapping feature to write into a memory buffer, are you really just saving the whole procedure into a queue so DirectX executes it when it is finished with other tasks?" I ask this because it is my perception of mapping when writing to a buffer, and I just want to make sure my perception is correct. I understand that the monitor is extremely slow compared to the processor, and I am sure the processor can execute 10 times the amount the screen can refresh. So is this one of the reasons you should map when writing to a buffer, so each procedure can be done in an orderly fashion? If someone could elaborate, that would be great. Thanks
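    Not part of the question: a minimal sketch of how a dynamic buffer is typically written through Map/Unmap, assuming the D3D11 API and a buffer created with D3D11_USAGE_DYNAMIC and D3D11_CPU_ACCESS_WRITE. With WRITE_DISCARD the call returns a fresh memory region right away rather than stalling until the GPU is done with the old contents, and the copied data is consumed when the queued draw commands execute.

        #include <d3d11.h>
        #include <cstring>

        // Map with WRITE_DISCARD: the driver hands back new memory immediately,
        // so the CPU does not wait on the GPU; the GPU reads the data later,
        // when the queued commands that reference this buffer are executed.
        HRESULT UpdateDynamicBuffer(ID3D11DeviceContext* context, ID3D11Buffer* buffer,
                                    const void* data, std::size_t byteCount)
        {
            D3D11_MAPPED_SUBRESOURCE mapped = {};
            HRESULT hr = context->Map(buffer, 0, D3D11_MAP_WRITE_DISCARD, 0, &mapped);
            if (FAILED(hr))
                return hr;
            std::memcpy(mapped.pData, data, byteCount);  // CPU-side copy into driver memory
            context->Unmap(buffer, 0);                   // hand it back; no GPU stall here
            return S_OK;
        }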

    Read the article

  • Advantages of using WCF to work with Sharepoint Services WSS3.0?

    - by val
    Hi folks, what is your opinion or, better yet, your practical experience of using WCF to work with WSS instead of the SP web services? I am writing a custom library for our software to store and retrieve files from WSS document libraries using the SharePoint web services. I am not entirely happy with the performance of the SP web services - a bit too slow in many cases. Now, Microsoft claims significant improvements in WCF over Remoting, and I am looking into a good way to use WCF for my file services. Any suggestions or ideas? Maybe a good source of coding practices or blogs? Thanks a lot, Val

    Read the article

  • C# Dynamic Form Components (Performance problem)

    - by Svisstack
    Hello, I have a performance problem with my code under Windows Forms. I have a form whose layout depends on data passed to the constructor, so the layout must be generated in OnLoad or in the constructor. The generation is simple: a base FlowLayoutPanel holds other FlowLayoutPanels, and each of those has a Label and a TextBox with data binding. The problem is that this is VERY SLOW, up to 20 seconds, and I am drawing fewer than 100 controls. From a Performance Session I know that about 70% of the time is spent in these functions: System.Windows.Forms.Control.ControlCollection.Add(class System.Windows.Forms.Control) and System.Windows.Forms.ControlBindingsCollection.Add(class System.Windows.Forms.Binding). What can I do about this? Can anyone help me solve this dynamic form layout problem?

    Read the article

  • Connect Android application to remote data

    - by tadywankenobi
    Sheesh, talk about limited information! I'm trying to get my Android application to connect to an online database to access information. There's quite a bit of info, including geotags, and these are going to be mapped in my app. The developer site has this very informative piece of information: "You can use the network (when it's available) to store and retrieve data on your own web-based services. To do network operations, use classes in the following packages: java.net.*, android.net.*". Like I said in my previous question, I'm still very much an Android newbie, and trying to remember my Java OOP from college is slow going. Does anyone have an example of how this might work, or how I could implement it? I wouldn't mind even connecting to a local XML file, if I could find a good example of how to do that! Am I just looking in all the wrong places? Help. Please! T

    Read the article

  • group_concat on an empty join in MySQL

    - by Yossarian
    Hello, I've got the following problem. I have two tables (simplified):

        +--------+     +-----------+
        | User   |     | Role      |
        +--------+     +-----------+
        | ID<PK> |     | ID <PK>   |
        +--------+     | Name      |
                       +-----------+

    and an M:N relationship between them:

        +-------------+
        | User_Role   |
        +-------------+
        | User<FK>    |
        | Role<FK>    |
        +-------------+

    I need to create a view which selects each User and, in one column, all of his Roles (this is done by group_concat). I've tried the following:

        SELECT u.*, group_concat(r.Name separator ',') as Roles
        FROM User u
        LEFT JOIN User_Role ur ON ur.User=u.ID
        LEFT JOIN Role r ON ur.Role=r.ID
        GROUP BY u.ID;

    However, this only works for a user with some defined roles; users without a role aren't returned. How can I modify the statement to return the User with an empty string in the Roles column when the User doesn't have any Role? Explanation: I'm passing the SQL data directly to a grid, which then formats itself, and it is easier for me to create a slow and complicated view than to format it in my code. I'm using MySQL.

    Read the article

  • How to prevent command/script from changing global environment

    - by guillermooo
    I need to run scriptblocks/scripts from the current top-level shell and I want them to leave the global environment unmodified. So far, I've only been able to think of the following possibilities:

        powershell -file <script>
        powershell -noprofile -command <scriptblock>

    The problem is that they are very slow. For instance, I would like to be able to do:

        mkdir newdir
        cd newdir
        $env:NEW_VAR = 100
        ni -item f 'newfile.txt'

    ...so that my shell's working dir wouldn't change and $env:NEW_VAR wouldn't be set in the global environment. Are there any more alternatives to accomplish this?

    Read the article

  • How to re-factor a web site for 3G?

    - by George2
    Hello everyone, I have a traditional web site which serves users on desktop computer browsers. I am using Microsoft technologies: ASP.NET, C#, .NET, SQL Server 2008, IIS and Windows Server 2008. Nowadays more and more users are on 3G mobile phones, and I am wondering, from a software perspective, how to add new features to my web site (do I need a client application that runs on the mobile phone as well?) so that 3G users get a good user experience or new kinds of 3G-specific applications. Any recommended documents or real samples are welcome. By 3G users I mean to distinguish them from traditional mobile phones with less power and slow GPRS network access. Thanks in advance, George

    Read the article

  • How do I write a Java text file viewer for big log files

    - by Hannes de Jager
    I am working on a software product with an integrated log file viewer. The problem is that it's slow and unstable for really large files, because it reads the whole file into memory when you view a log file. I want to write a new log file viewer that addresses this problem. What are the best practices for writing viewers for large text files? How do editors like Notepad++ and Vim accomplish this? I was thinking of using a buffered bi-directional text stream reader together with Java's TableModel. Am I thinking along the right lines, and are such stream implementations available for Java?
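    Not the asker's code: a minimal sketch of the usual approach, which is to scan the file once to index the byte offset of every line start and then seek and read only the window of lines the viewer needs. It is written in C++ here purely to illustrate the idea; in Java the same pattern would typically use RandomAccessFile behind a TableModel that fetches rows on demand.

        #include <fstream>
        #include <string>
        #include <vector>

        // Index line-start offsets once, then fetch any line by number with a seek,
        // so memory use is proportional to the line count, not the file size.
        class LogIndex {
        public:
            explicit LogIndex(const std::string& path) : file_(path, std::ios::binary) {
                offsets_.push_back(0);
                std::string line;
                while (std::getline(file_, line))
                    offsets_.push_back(static_cast<std::streamoff>(file_.tellg()));
                offsets_.pop_back();   // the last entry is end-of-file, not a line start
                file_.clear();         // clear the EOF flag so we can seek later
            }

            std::size_t lineCount() const { return offsets_.size(); }

            std::string line(std::size_t n) {       // fetch a single line on demand
                file_.seekg(offsets_.at(n));
                std::string text;
                std::getline(file_, text);
                return text;
            }

        private:
            std::ifstream file_;
            std::vector<std::streamoff> offsets_;
        };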

    Read the article

  • Maximum number of files in one ext3 directory while still getting acceptable performance?

    - by knorv
    I have an application writing to an ext3 directory which over time has grown to roughly three million files. Needless to say, reading the file listing of this directory is unbearably slow. I don't blame ext3. The proper solution would have been to let the application code write to sub-directories such as ./a/b/c/abc.ext rather than using only ./abc.ext. I'm changing to such a sub-directory structure and my question is simply: roughly how many files should I expect to store in one ext3 directory while still getting acceptable performance? What's your experience? Or, in other words: assuming that I need to store three million files in the structure, how many levels deep should the ./a/b/c/abc.ext structure be? Obviously this is a question that cannot be answered exactly, but I'm looking for a ballpark estimate.
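    As an illustration of the bookkeeping involved (not from the question itself), here is a small C++ sketch that derives the nested path from a hash of the filename. With two hex characters per level, two levels give 65,536 leaf directories, which spreads three million files out to roughly 45 per directory; the function name and the "." root are made up for the example.

        #include <cstdio>
        #include <functional>
        #include <string>

        // Build a path like ./3f/a7/report.pdf from a flat filename, so that no
        // single directory has to hold more than a few dozen of the files.
        std::string nestedPath(const std::string& filename, int levels = 2) {
            char hex[17];
            std::snprintf(hex, sizeof hex, "%016zx",
                          std::hash<std::string>{}(filename));
            std::string path = ".";
            for (int i = 0; i < levels; ++i)
                path += "/" + std::string(hex + 2 * i, 2);   // two hex chars per level
            return path + "/" + filename;
        }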

    Read the article

  • iPhone UIImage number recognition

    - by Skeep
    Hi All, I have a small UIImage (JPG) with a single typed number. I want to be able to read the number with some kind of pattern recognition, and I'm really not sure where to start, so any help would be appreciated. My initial idea was to compare this image with other images: for instance, compare the image with that of a 1, 2, 3, etc. until a match is found. That just seems slow and cumbersome, and I wondered if there was a better way to do it. Thanks
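    For reference, the brute-force comparison described above is usually called template matching: score the unknown glyph against each reference digit image and keep the best score. A minimal C++ sketch follows; it assumes the images have already been scaled to the same size and converted to plain 8-bit grayscale buffers, which is the part an iOS implementation would have to do first.

        #include <cstdint>
        #include <cstdlib>
        #include <vector>

        // Return the index (0-9) of the reference digit whose pixels differ least
        // from the glyph, using the sum of absolute differences as the score.
        int matchDigit(const std::vector<uint8_t>& glyph,
                       const std::vector<std::vector<uint8_t>>& references) {
            int best = -1;
            long bestScore = 0;
            for (std::size_t digit = 0; digit < references.size(); ++digit) {
                long score = 0;
                for (std::size_t i = 0; i < glyph.size(); ++i)
                    score += std::labs(static_cast<long>(glyph[i]) -
                                       static_cast<long>(references[digit][i]));
                if (best < 0 || score < bestScore) {
                    bestScore = score;
                    best = static_cast<int>(digit);   // index doubles as the digit value
                }
            }
            return best;   // -1 if no reference images were supplied
        }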

    Read the article

  • How to submit using a link with my current code (PHP + jQuery + AJAX)

    - by ruslyrossi
    My current code:

        <script type="text/javascript">
        $(document).ready(function() {
            $('#form').ajaxForm({
                target: '#preview',
                success: function() {
                    $('#success_box').addClass('success')
                    setTimeout(function() {
                        $('#success_box').fadeOut('slow');
                    }, 3100); // <-- time in milliseconds
                }
            });
        });
        </script>

        <form action="controller/product_edit_update.php" method="post" id="form" name="form">
            bla..bla..
            <input type="submit" value="Save" />
        </form>

    but now I want to add a submit link as well: <a href="" id="submit_with_link">Save</a>

    Read the article

  • Web page DB query optimisation

    - by morpheous
    I am putting together a web page which is quite 'expensive' in terms of DB hits. I don't want to start optimizing at this stage - though with me trying to hit a deadline, I may end up not optimizing at all. Currently the page requires 18 (that's right, eighteen) hits to the DB. I am already using joins, and some of the queries are UNIONed to minimize the trips to the DB. My local dev machine can handle this (the page is not slow); however, I feel that if I release this into the wild, the number of queries will quickly overwhelm my database (MySQL). I could always use memcache or something similar, but I would much rather continue with my other dev work that needs to be completed before the deadline - at least retrieving the page works, and it's simply a matter of optimization. My question therefore is: is 18 DB queries for a single page retrieval completely outrageous (i.e. should I put everything on hold and optimize the hell out of the retrieval logic), or shall I continue as normal, meet the deadline, release on schedule and see what happens?

    Read the article

  • SQL Server 2008 + expensive union all

    - by Tim Mahy
    Hi all, we have 5 tables which we query with user search input through a stored procedure. We do a UNION ALL of the similar data inside a view; because of this the view cannot be materialized. We are not able to change these 5 tables drastically (like creating a 6th table that contains the similar data of the 5 tables and referencing that new one from the 5 tables). The query is rather expensive/slow. What are our other options? It's allowed to think outside the box. Unfortunately I cannot give more information like the table/view/SP definitions because of customer confidentiality... Greetings, Tim

    Read the article

  • Using pow() for large numbers

    - by g4ur4v
    I am trying to solve a problem, part of which requires me to calculate (2^n) % 1000000007, where n <= 10^9. But the following code gives me the output "0" even for input like n = 99. Is there any way other than a loop that multiplies by 2 and takes the modulo on every iteration (this is not what I am looking for, as it will be very slow for large numbers)?

        #include<stdio.h>
        #include<math.h>
        #include<iostream>
        using namespace std;

        int main()
        {
            unsigned long long gaps,total;
            while(1)
            {
                cin>>gaps;
                total=(unsigned long long)powf(2,gaps)%1000000007;
                cout<<total<<endl;
            }
        }
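    Not part of the question: the usual fix is square-and-multiply (binary exponentiation), which needs only about log2(n) multiplications and cannot overflow here because the modulus is below 2^31, so every intermediate product fits in an unsigned 64-bit integer. A minimal sketch:

        #include <cstdint>
        #include <iostream>

        // Computes (base^exp) % mod in O(log exp) multiplications.
        uint64_t pow_mod(uint64_t base, uint64_t exp, uint64_t mod) {
            uint64_t result = 1;
            base %= mod;
            while (exp > 0) {
                if (exp & 1)                       // current bit of the exponent is set
                    result = result * base % mod;
                base = base * base % mod;          // square for the next bit
                exp >>= 1;
            }
            return result;
        }

        int main() {
            uint64_t n;
            while (std::cin >> n)
                std::cout << pow_mod(2, n, 1000000007ULL) << '\n';
        }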

    Read the article

  • PHP: Simulate multiple MySQL connections to same database

    - by Varun
    An INSERT query is constantly getting logged in my MySQL slow query log. I want to see how much time the INSERT query takes with 100 simultaneous insert operations (to the same table). So I need to simulate the following: 500 different simultaneous connections from PHP to the same database on a MySQL server, all of which are inserting a row (simultaneously) into the same table. Do I need to use a load testing tool, or can I simply write a PHP script to do this? Any thoughts?

    Read the article

  • What do I need to do now that I want to take my programming hobby to the next level?

    - by hohog
    I've always wanted to make games, but I did not start actively learning programming on my own until my 1st year of university. I kept going throughout university, learning new languages and showing off things I had made, while neglecting my major in Biology. Anyway, I've ended up with an Economics degree and a portfolio of SaaS and web apps I created so I could eat during my final year. So far, I'm getting a few interviews here and there for web programming positions. When I get a logic pretest, I fail miserably, or the job requires a comp sci degree. I mean, I can easily design and code an entire app, which I emphasize through my portfolio... but I don't know why I am so slow at logic puzzles in prescreening interviews. So what should I do now? Get certificates in languages? Go back to school and study CS? Is it too late to get into Windows programming jobs rather than web programming?

    Read the article

  • SQL Server 2005 - Understanding output of DBCC SHOWCONTIG

    - by user169743
    I'm seeing some slow performance on a SQL Server 2005 database. I've been doing some research regarding SQL Server performance, but I'm having difficulty fully understanding the output of SHOWCONTIG and would be very grateful if someone could have a look and offer some suggestions to improve performance.

        TABLE level scan performed.
        Pages Scanned................................: 19348
        Extents Scanned..............................: 2427
        Extent Switches..............................: 3829
        Avg. Pages per Extent........................: 8.0
        Scan Density [Best Count:Actual Count].......: 63.16% [2419:3830]
        Logical Scan Fragmentation ..................: 8.40%
        Extent Scan Fragmentation ...................: 35.15%
        Avg. Bytes Free per Page.....................: 938.1
        Avg. Page Density (full).....................: 88.41%

    Read the article

  • How to do a batch update?

    - by chobo2
    Hi, I am wondering: is there a way to do batch updating? I am using MS SQL Server 2005. I saw a way with the SqlDataAdapter, but it seems you first have to run the select statement with it, then fill a dataset and make the changes in the dataset. Now I am using LINQ to SQL to do the select, so I want to try to keep it that way; however, it is too slow to do massive updates. So is there a way I can keep my LINQ to SQL (for the select part) but use something different to do the mass update? Thanks

    Read the article

  • Sorting 1000-2000 elements with many cache misses

    - by Soylent Graham
    I have an array of 1000-2000 elements which are pointers to objects. I want to keep my array sorted, and obviously I want to do this as quickly as possible. They are sorted by a member and not allocated contiguously, so assume a cache miss whenever I access the sort-by member. Currently I'm sorting on-demand rather than on-add, but because of the cache misses and [presumably] non-inlining of the member access, the inner loop of my quicksort is slow. I'm doing tests and trying things now (and seeing what the actual bottleneck is), but can anyone recommend a good alternative to speed this up? Should I do an insertion sort instead of quicksorting on-demand, or should I try to change my model to make the elements contiguous and reduce cache misses? Or is there a sort algorithm I've not come across which is good for data that is going to cache miss?
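    One common alternative, sketched below (my example, not the asker's code, with a hypothetical Object type standing in for the real element): gather each sort key next to its pointer so the comparator only touches contiguous memory, paying the scattered-object cache misses once during the gather pass instead of on every comparison.

        #include <algorithm>
        #include <utility>
        #include <vector>

        struct Object { int sortKey; /* ... */ };   // hypothetical element type

        // Sort pointers by Object::sortKey via a contiguous (key, pointer) array.
        void sortByKey(std::vector<Object*>& items) {
            std::vector<std::pair<int, Object*>> keyed;
            keyed.reserve(items.size());
            for (Object* p : items)
                keyed.emplace_back(p->sortKey, p);   // one cache miss per element, up front

            std::sort(keyed.begin(), keyed.end(),
                      [](const std::pair<int, Object*>& a,
                         const std::pair<int, Object*>& b) { return a.first < b.first; });

            for (std::size_t i = 0; i < items.size(); ++i)
                items[i] = keyed[i].second;          // write the sorted order back
        }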

    Read the article

  • How to animate a menu item into a large div (window) using jQuery's animate?

    - by ijjo
    I'm pretty sure this can be done pretty easily with jQuery's animate API, but I'm not good enough to figure it out. What I want to do is this: I have a menu item at the top of the viewport that the user will click on. When the user clicks on it, you'll see something that looks like a div "popping" out of the menu and floating to a particular location on the screen. When I say popping I don't mean anything fancy; I just mean it appears to originate from the menu item and settle somewhere on the screen that I specify. The important part is that this animation happens really fast: fast enough that you don't actually have to wait for the window to appear, but slow enough that the eye sees the animation start from the menu item and end up at the new location where the window will appear, with a width and height that I specify. Hope that all made sense?

    Read the article

  • How should I capture clickstream data?

    - by editor
    I'd like to start using clickstream analysis to improve a dynamic site's user experience. I'd like to rule out two options: parameterizing URLs (index.php?src=http://www.example.com) and immediate database logging. The former makes pretty ugly URLs and isn't great for SEO, and the latter might slow down page rendering when there are lots of concurrent users. Assuming these aren't viable options, I think I'm left with doing an asynchronous POST to a server-side script that runs a database query and returns a 204 (no content) response. Is this the best option for capturing clickstream data?

    Read the article

  • WCF client hangs on response

    - by JohnIdol
    I have a WCF client (running on Win7) pointing to a WebSphere service. All is good from a test harness (a little test fixture outside my web app), but when my calls to the service originate from my web project, one of the calls is extremely slow to deserialize (it takes up to 10 times longer), and not just the first time. I can see from Fiddler that the response comes back quickly, but then the WCF client hangs on the response itself for more than a minute before the next line of code is hit by the debugger, almost as if the client were having trouble deserializing. This happens only if the response contains a given PDF string, base64 encoded and chunked. If, for example, the service raises a fault (and this PDF string is not there), then the response is deserialized immediately. Again, if I send the exact same envelope through SoapUI or from outside the web project, all is good. I am at a loss: what should I be looking for, and is there some config setting that might do the trick? Any help appreciated!

    Read the article

  • Query to update rowNum

    - by BrokeMyLegBiking
    Can anyone help me write this query more efficiently? I have a table that captures TCP traffic, and I'd like to update a column called RowNumInFlow, which is simply the sequential number of the IP packet in that flow. The code below works fine, but it is slow.

        declare @FlowID int
        declare @LastRowNumInFlow int
        declare @counter1 int
        set @counter1 = 0

        while (@counter1 < 1)
        BEGIN
            set @counter1 = @counter1 + 1

            -- 1)
            select top 1 @FlowID = t.FlowID
            from Traffic t
            where t.RowNumInFlow is null

            if (@FlowID is null) break

            -- 2)
            set @LastRowNumInFlow = null

            select top 1 @LastRowNumInFlow = RowNumInFlow
            from Traffic
            where FlowID=@FlowID and RowNumInFlow is not null
            order by ID desc

            if @LastRowNumInFlow is null
                set @LastRowNumInFlow = 1
            else
                set @LastRowNumInFlow = @LastRowNumInFlow + 1

            update Traffic
            set RowNumInFlow = @LastRowNumInFlow
            where ID = (select top 1 ID from Traffic
                        where flowid = @FlowID and RowNumInFlow is null)
        END

    Example table values after the query has run:

        ID      FlowID  RowNumInFlow
        448923  44      1
        448924  44      2
        448988  44      3
        448989  44      4
        448990  44      5
        448991  44      6
        448992  44      7
        448993  44      8
        448995  44      9
        448996  44      10
        449065  44      11
        449063  45      1
        449170  45      2
        449171  45      3
        449172  45      4
        449187  45      5

    Read the article

  • What is the most efficient method to find x contiguous values of y in an array?

    - by Alec
    Running my app through callgrind revealed that this line dwarfed everything else by a factor of about 10,000. I'm probably going to redesign around it, but it got me wondering: is there a better way to do it? Here's what I'm doing at the moment:

        int i = 1;
        while ( ( (*(buffer++) == 0xffffffff && ++i) || (i = 1) )
                && i < desiredLength + 1
                && buffer < bufferEnd );

    It's looking for the offset of the first chunk of desiredLength 0xffffffff values in a 32-bit unsigned int array. It's significantly faster than any implementation I could come up with involving an inner loop, but it's still too damn slow.
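    For comparison, here is a straightforward single-pass version of the same search written as a run counter (my sketch, with made-up names findRun/runLength, not the asker's code); whether it actually beats the condensed while-loop above is something only the profiler can settle.

        #include <cstdint>
        #include <cstddef>

        // Return a pointer to the start of the first run of runLength consecutive
        // 0xffffffff values in [buffer, bufferEnd), or nullptr if there is none.
        const uint32_t* findRun(const uint32_t* buffer, const uint32_t* bufferEnd,
                                std::size_t runLength) {
            std::size_t count = 0;
            for (const uint32_t* p = buffer; p != bufferEnd; ++p) {
                count = (*p == 0xffffffffu) ? count + 1 : 0;
                if (count == runLength)
                    return p - (runLength - 1);   // the run started runLength-1 elements back
            }
            return nullptr;
        }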

    Read the article
