Search Results

Search found 10417 results on 417 pages for 'large'.


  • Fibonacci Sequence fast implementation

    - by user2947615
    I have written this function in Scala to calculate the Fibonacci number at a given index n: def fibonacci(n: Long): Long = { if (n <= 1) n else fibonacci(n - 1) + fibonacci(n - 2) }. However, it is not efficient when calculating with large indexes, because it recomputes the same values an exponential number of times. Therefore I need to implement a function using a tuple, and this function should return two consecutive values as the result. Can somebody give me any hints about this? I have never used Scala before. Thanks!
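    One way to act on the tuple hint is to carry the pair (fib(k), fib(k + 1)) as an accumulator, so each value is computed exactly once. A minimal sketch in plain Scala (note that Long overflows beyond fibonacci(92); switch to BigInt for larger indexes):

      import scala.annotation.tailrec

      // fibPair(n) returns (fib(n), fib(n + 1)); each step shifts the pair forward once.
      def fibPair(n: Long): (Long, Long) = {
        @tailrec
        def loop(i: Long, pair: (Long, Long)): (Long, Long) =
          if (i == 0) pair
          else loop(i - 1, (pair._2, pair._1 + pair._2))
        loop(n, (0L, 1L))
      }

      def fibonacci(n: Long): Long = fibPair(n)._1

    Because loop is tail-recursive, this runs in linear time and constant stack space.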

  • OGNL thread safety

    - by Dewfy
    I'm going to reuse the OGNL library outside the Struts2 scope. I have a rather large set of formulas, which is why I would like to precompile all of them with Ognl.parseExpression(expressionString). But I'm not sure whether a precompiled expression can be used in a multi-threaded environment. Does anybody know if it can?
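    A minimal sketch of the parse-once, evaluate-many pattern the question describes, using the plain OGNL API from Scala; the expression string and root object are placeholders, and whether the parsed tree may safely be shared across threads is exactly what should be verified for the OGNL version in use:

      import ognl.Ognl

      // Parse each formula once up front...
      val compiled = Ognl.parseExpression("price * quantity")

      // ...then evaluate it per call against a per-request root object.
      def evaluate(root: AnyRef): AnyRef =
        Ognl.getValue(compiled, root)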

  • Does anyone know of a good guide to configure GC in Java?

    - by evilpenguin
    I'm having trouble with a JVM running an app whose heap memory usage looks like a comb: it constantly jumps from 1.5 GB to 3 GB and slowly creeps toward higher values. I'm using the G1 GC algorithm but have no idea how to configure it. I do not have access to the code of the app I'm running and, needless to say, it's a rather large app. So, bottom line, does anyone know of a good guide to configuring GC in Java?

  • Use Awk to Print every character as its own column?

    - by wizkid84
    Hi there, I need to reorganize a large CSV file. The first column, which is currently a 6-digit number, needs to be split up, using commas as the field separator. For example, I need this:
    022250,10:50 AM,274,22,50
    022255,11:55 AM,275,22,55
    turned into this:
    0,2,2,2,5,0,10:50 AM,274,22,50
    0,2,2,2,5,5,11:55 AM,275,22,55
    Let me know what you think! Thanks!
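    The question asks for awk; purely as a hedged illustration of the transformation itself (explode the first comma-separated field into one column per character and leave the remaining fields untouched), here is the same idea sketched in Scala:

      // Reads CSV lines from stdin and splits the first field into per-character columns.
      scala.io.Source.stdin.getLines().foreach { line =>
        val cols = line.split(",", -1)
        val exploded = cols.head.map(_.toString) ++ cols.tail
        println(exploded.mkString(","))
      }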

  • Detect *target* file size using JavaScript

    - by noblethrasher
    Hi, I would like to write a script to detect the file size of the target of a link on a web page. Right now I have a function that finds all links to PDF files (i.e. the href ends with '.pdf') and appends the string '[pdf]' to the innerText. I would like to extend it so that I can also append some text advising the user that the target is a large file (e.g. greater than 1 MB). Thanks
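    Whatever language the request is made from, the usual trick is that the target's size is advertised by the Content-Length header of a HEAD response, so nothing has to be downloaded. A rough sketch of that idea (shown in Scala rather than browser JavaScript, and assuming the server actually sends Content-Length):

      import java.net.{HttpURLConnection, URL}

      // Returns the advertised size in bytes, or -1 if the server does not say.
      def targetSizeBytes(href: String): Long = {
        val conn = new URL(href).openConnection().asInstanceOf[HttpURLConnection]
        conn.setRequestMethod("HEAD")
        try conn.getContentLengthLong
        finally conn.disconnect()
      }

    A page script would do the equivalent with an asynchronous HEAD request and compare the result against the 1 MB threshold before appending the warning text.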

  • Database design advice needed.

    - by user346271
    Hi all, I'm a lone developer for a telecoms company, and am after some database design advice from anyone with a bit of time to answer. I am inserting ~2 million rows each day into one table; these tables then get archived and compressed on a monthly basis. Each monthly table contains ~15,000,000 rows, and this is increasing month on month. For every insert above I combine the data from rows which belong together and create another "correlated" table. This table is currently not being archived, as I need to make sure I never miss an update to it (hope that makes sense), although in general this information should remain fairly static after a couple of days of processing. All of the above is working perfectly. However, my company now wishes to run some stats against this data, and these tables are getting too large to provide the results in what would be deemed a reasonable time, even with the appropriate indexes set. So after all the above my question is quite simple: should I write a script which groups the data from my correlated table into smaller tables, or should I store the query result sets in something like memcache? I'm already using MySQL's query cache, but because I have limited control over how long the data is stored for, it's not working ideally.
    The main advantages I can see of using something like memcache: no blocking on my correlated table once the query has been cached; greater flexibility in sharing the collected data between the backend collector and the front-end processor (i.e. custom reports could be written in the backend and their results stored in the cache under a key which is then shared with anyone who wants to see that report's data); redundancy and scalability if we start sharing this data with a large number of customers. The main disadvantage of using something like memcache: the data is not persistent if the machine is rebooted or the cache is flushed.
    The main advantages of using MySQL: persistent data; fewer code changes (although adding something like memcache is trivial anyway). The main disadvantages of using MySQL: I have to define table templates every time I want to provide a new set of grouped data; I have to write a program which loops through the correlated data and fills these new tables; queries will potentially still get slower as the data continues to grow.
    Apologies for quite a long question. It's helped me to write down these thoughts here anyway, and any advice/help/experience with dealing with this sort of problem would be greatly appreciated. Many thanks. Alan

  • Higher level database layer for Android?

    - by sweetiecakes
    Are there any good database abstraction layers/object relational mappers/ActiveRecord implementations/whatever they are called for Android? I'm aware that db4o is officially supported, but it has quite a large footprint and I'd rather use a more conventional database (SQLite).

  • How to switch from VARCHAR to TEXT in SQL 2000?

    - by MatthewMartin
    What do I need to consider before I switch a bunch of fields from VARCHAR(bignumber) to TEXT? Aside from performance, the fact that TEXT will be deprecated sometime in the far future, and the fact that it looks like I need to drop and recreate the table to alter the column's data type? This is for SQL 2000 -- I can't do VARCHAR(max), and VARCHAR(8000) isn't large enough.

  • Best way to get a query result

    - by xgoan
    I'm developing an application that gets large images from an Internet server. What is the best way to download these images without freezing the entire application? I mean a background download. I have thought about downloading them in another thread.
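    The platform isn't stated, so purely as a hedged, generic sketch of the "download on another thread" idea (Scala with a Future here; the example URL is a placeholder, and the UI layer would be updated from its own main thread once the download completes):

      import java.net.URL
      import scala.concurrent.{ExecutionContext, Future}
      import scala.util.{Failure, Success}

      implicit val ec: ExecutionContext = ExecutionContext.global

      // Fetch the image bytes off the main thread.
      def fetchImage(url: String): Future[Array[Byte]] = Future {
        val in = new URL(url).openStream()
        try in.readAllBytes() finally in.close()
      }

      // React when the download finishes, without blocking the UI.
      fetchImage("http://example.com/big.jpg").onComplete {
        case Success(bytes) => println(s"Downloaded ${bytes.length} bytes")
        case Failure(err)   => println(s"Download failed: $err")
      }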

  • Obtaining memory available to JVM at runtime

    - by Bo Tian
    I'm trying to sort a bunch of data where the size of the input can be larger than the memory available to the JVM, and handling that requires an external sort, which is much slower than Quicksort. Is there any way of obtaining the memory available to the JVM at runtime, so that I could use in-place sorting as much as possible and only switch to Mergesort when the input is too large?
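    The JVM does expose this at runtime through java.lang.Runtime; a small sketch (the figures are estimates and move as the GC runs, so any cutoff between in-place sorting and the external sort should keep a safety margin):

      val rt = Runtime.getRuntime

      // Heap currently in use, and the ceiling the JVM may still grow to (-Xmx).
      val used         = rt.totalMemory() - rt.freeMemory()
      val maxAvailable = rt.maxMemory() - used

      println(s"Roughly ${maxAvailable / (1024 * 1024)} MB of heap still available")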

  • Cocoa touch: decorating text

    - by user365904
    I've added a UIAppFont to my plist and, happily, am able to write a custom font to my display. Now, if I had to display this custom font in a very large size with a yellow outline and purple in the middle, how in the world would I achieve that?

  • Number of elements in Python Set

    - by Tim
    I have a list of phone numbers that have been dialed (nums_dialed). I also have a set of phone numbers which are the numbers in a client's office (client_nums). How do I efficiently figure out how many times I've called a particular client (total)? For example:
    >>> nums_dialed = [1, 2, 2, 3, 3]
    >>> client_nums = set([2, 3])
    >>> ???
    total = 4
    The problem is that I have a large-ish dataset: len(client_nums) ~ 10^5 and len(nums_dialed) ~ 10^3.
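    Since membership tests on a set are O(1) on average, a single pass over the dialed list is enough (in Python that would be a sum over a generator with a membership test). The same idea sketched in Scala, using the question's sample data:

      val numsDialed = List(1, 2, 2, 3, 3)
      val clientNums = Set(2, 3)

      // One linear pass; each contains() lookup on the Set is O(1) on average.
      val total = numsDialed.count(clientNums.contains)   // 4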

  • How to program a text search and replace in PDF files

    - by rpilkey
    How would I be able to programmatically search and replace some text in a large number of PDF files? I would like to remove a URL that has been added to a set of files. I have been able to remove the link using javascript under Batch Processing in Adobe Pro, but the link text remains. I have seen recommendations to use text touchup, which works manually, but I don't want to modify 1300 files manually.

  • Why is J2EE scalable?

    - by py213py
    I have heard from various sources that J2EE is highly scalable, but to me it seems that you could never scale a J2EE application to the level of the Google search engine or any other large website. I would like to hear the technical reasons why it is so scalable.

  • Using many mutex locks

    - by hanno
    I have a large tree structure on which several threads are working at the same time. Ideally, I would like to have an individual mutex lock for each cell. I looked at the definition of pthread_mutex_t in bits/pthreadtypes.h and it is fairly short, so the memory usage should not be an issue in my case. However, is there any performance penalty when using many (let's say a few thousand) different pthread_mutex_ts for only 8 threads?
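    If per-cell mutexes ever become too heavy, one commonly used alternative is lock striping: hash each cell to one of a small fixed pool of locks, trading a little contention for a much smaller footprint. A sketch of that pattern (on the JVM rather than pthreads, purely to illustrate the idea; the stripe count is arbitrary):

      import java.util.concurrent.locks.ReentrantLock

      // Lock striping: map each cell to one of a fixed pool of locks
      // instead of giving every cell its own mutex.
      class LockStripes(stripes: Int) {
        private val locks = Array.fill(stripes)(new ReentrantLock())

        def lockFor(cell: AnyRef): ReentrantLock =
          locks(((cell.hashCode % stripes) + stripes) % stripes)
      }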

  • Is Cassandra database row size limited by available memory?

    - by Adam Hollidge
    I'm working with very long time series -- hundreds of millions of data points in one series -- and am considering Cassandra as a data store. In this question, one of the Cassandra committers (the über helpful jbellis) says that Cassandra rows can be very large, and that column slicing operations are faster than row slices, hence my question: Is the row size still limited by available memory?

  • Should I HttpCombine the Google-hosted jQuery file?

    - by chobo2
    Hi, I am using something called HttpCombiner: http://code.msdn.microsoft.com/HttpCombiner It is an HTTP handler that combines multiple CSS, JavaScript or URL references into one response for faster page load. It can combine, compress and cache the response, which results in faster page loads and better scalability of the web application. It's good practice to use many small JavaScript and CSS files instead of one large JavaScript/CSS file for better code maintainability, but bad in terms of website performance. Although you should write your JavaScript code in small files and break large CSS files into small chunks, when the browser requests those JavaScript and CSS files it makes one HTTP request per file. Every HTTP request results in a network roundtrip from your browser to the server, and the delay in reaching the server and coming back to the browser is called latency. So, if you have four JavaScript and three CSS files loaded by a page, you are wasting time in seven network roundtrips. Within the USA, latency averages around 70 ms, so you waste 7 x 70 = 490 ms, about half a second of delay. Outside the USA, average latency is around 200 ms, which means 1400 ms of waiting. The browser cannot show the page properly until the CSS and JavaScript are fully loaded, so the more latency you have, the slower the page loads. You can reduce the wait time by using a CDN (read my previous blog post about using a CDN). However, a better solution is to deliver multiple files over one request using an HttpHandler that combines several files and delivers them as one output. So, instead of putting many script or link tags, you just put one script tag and one link tag, and point them to the HttpHandler. You tell the handler which files to combine and it delivers those files in one response. This saves the browser from making many requests and eliminates the latency. This HTTP handler reads the file names defined in a configuration, combines all those files and delivers them as one response. It delivers the response gzip-compressed to save bandwidth. Moreover, it generates the proper cache headers to cache the response in the browser cache, so that the browser does not request it again on a future visit.
    Now I am wondering, since it can handle adding links, should I put the jQuery file in it? The reason I am not sure is that if it gets combined with my other files, I think I might lose the advantages of it being hosted on Google's servers, such as caching (my thinking is that if it gets combined it will look different, so even if a user already has it in their cache, I am not sure whether the cached copy will be used or not). So should I combine it, or only the files that I am using locally?

  • Suggestion for developing search engine

    - by MohamedGooner
    I want to develop a simple search engine, using ASP.NET and C#, where I can search for a word that is contained in a very big text (like the Holy Bible or something like that), and the program then shows the user where the word is. I have no idea which database I could put this large text in, or which method I should use to search for a word. Any suggestions would help me, and if anyone has a tutorial for anything similar it would benefit me.
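    Independently of the database choice, the standard method behind word search over a big text is an inverted index: a map from each word to the positions (or documents) where it occurs; full-text indexes in SQL Server or libraries such as Lucene implement the same idea at scale. A tiny sketch of the core structure (in Scala here, not ASP.NET/C#):

      // Build a map from each lower-cased word to every character offset where it starts.
      def buildIndex(text: String): Map[String, Vector[Int]] =
        raw"\w+".r
          .findAllMatchIn(text.toLowerCase)
          .map(m => m.matched -> m.start)
          .toVector
          .groupMap(_._1)(_._2)

    Looking a word up in the resulting map then returns every offset where it appears, without rescanning the text.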

  • In an AVL tree, under what condition is balancing done? Proper code in C language

    - by bachchan
    Binary search follows the divide-and-conquer method, whereas linear search doesn't. The time complexity of binary search is O(log n), but in the case of linear search the time complexity is O(n). That's why binary search is preferred over linear search. But this is true when the list of items is large; in the case of a smaller list, linear search is best (i.e. it is only a best-case concern).
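    For reference, a minimal binary search over a sorted array, which is where the O(log n) figure above comes from (the title asks for C; this sketch is in Scala to keep these notes in one language, but the logic is identical):

      // Returns the index of key in the sorted array xs, or -1 if it is absent.
      def binarySearch(xs: Array[Int], key: Int): Int = {
        var lo = 0
        var hi = xs.length - 1
        while (lo <= hi) {
          val mid = lo + (hi - lo) / 2   // avoids overflow of lo + hi
          if (xs(mid) == key) return mid
          else if (xs(mid) < key) lo = mid + 1
          else hi = mid - 1
        }
        -1
      }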

  • A SELECT statement for MySQL

    - by Hossein
    I have this table: id, bookmarkID, tagID. I want to fetch the top N bookmarkIDs for a given list of tags. Does anyone know a very fast solution for this? The table is quite large (12 million records). I am using MySQL.
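    One common shape for this query is a GROUP BY over the tag matches; a hedged sketch, where the table name bookmark_tags, the tag list and the LIMIT are all placeholders (held in a Scala string so it can be handed to any MySQL client). An index on (tagID, bookmarkID) usually helps, since the query can then be answered from the index alone:

      // Count how many of the requested tags each bookmark carries and keep the top N.
      val topBookmarksSql =
        """SELECT bookmarkID, COUNT(*) AS matches
          |FROM bookmark_tags
          |WHERE tagID IN (1, 2, 3)
          |GROUP BY bookmarkID
          |ORDER BY matches DESC
          |LIMIT 10""".stripMargin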
