Search Results

Search found 65999 results on 2640 pages for 'large data volumes'.


  • Quick backup system for large projects

    - by kamziro
    I've always backed up all my source code into .zip files, put them on my USB drive, and uploaded them to a server somewhere else in the world. However, I only do this once every two weeks, because my project is a little big. Right now my project directories (I have a few of them) contain a hierarchy of C++ files, interspersed with .o files that would make backing up take a while if they aren't ignored. What tools exist that will let me back things up efficiently and conveniently, let me specify which file types to back up (lots of .png, .jpg and some text types in there), and let me specify which directories to ignore (especially the build dirs)? Or are there any ingenious methods people use?

    Read the article

  • How can I return IEnumerable data from a function to a GridView with Entity Framework?

    - by programmerist
        protected IEnumerable GetPersonalsData()
        {
            // List personel;
            using (FirmaEntities firmactx = new FirmaEntities())
            {
                var personeldata = (from p in firmactx.Personals
                                    select new { p.ID, p.Name, p.SurName });
                return personeldata.AsEnumerable();
            }
        }

    I want to pass GetPersonelData() to a GridView data source, like this:

        gwPersonel.DataSource = GetPersonelData();
        gwPersonel.DataBind();

    On gwPersonel.DataBind() it shows me this error: "The ObjectContext instance has been disposed and can no longer be used for operations that require a connection."
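
    A common fix (sketched below, assuming the FirmaEntities/Personals model from the question) is to materialize the query with ToList() while the context is still alive, so the GridView binds to an in-memory list instead of a deferred query that only runs after the using block has disposed the context:

        protected IEnumerable GetPersonalsData()
        {
            using (FirmaEntities firmactx = new FirmaEntities())
            {
                // ToList() executes the query now; binding later no longer needs the context.
                return (from p in firmactx.Personals
                        select new { p.ID, p.Name, p.SurName }).ToList();
            }
        }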

    Read the article

  • Working with large sprite sheets on iPhone

    - by lukya
    Hi all, I am trying to use sprite sheet animation in my application. The first proof of concept with a small sprite sheet worked fine, but when I switch to a bigger sheet I get a "check_safe_call: could not restore current frame" warning and the application quits. A quick search revealed that this means my app is taking too much memory or the image is too large. My image is 4.9 MB and its dimensions are 6720 x 10080 (oops!). I read that the iPhone allows a maximum 3 MB image with dimensions up to 1024 x 1024, and that sprite sheet dimensions should be a power of two. So please let me know how I can use a sprite sheet this big. One approach could be to cut the sprite sheet into many smaller sprite sheets and use them one at a time. Please suggest any other/better approach to accommodate bigger sprite sheets, and whether the problem with my sprite sheet is its size (4.9 MB) or its dimensions (6720 x 10080). (Just FYI, I am not trying to play a movie, so using an MP4 file instead is not an option for me; I need to animate the sprite sheet based on accelerometer input, and I was able to achieve that in my proof of concept with the smaller sheet.) Thanks, Swapnil
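
    If you go the route suggested above and cut the sheet into smaller power-of-two sheets, the slicing can be done offline with any image library; below is a sketch of that preprocessing step in C# with System.Drawing (illustrative desktop tooling only, not iPhone code, and the tile size and file names are just examples):

        using System;
        using System.Drawing;
        using System.Drawing.Imaging;

        class SheetSlicer
        {
            // Cuts sheet.png into 1024x1024 tiles named tile_<row>_<col>.png.
            static void Main()
            {
                const int TileSize = 1024;
                using (var sheet = new Bitmap("sheet.png"))
                {
                    for (int y = 0; y < sheet.Height; y += TileSize)
                    {
                        for (int x = 0; x < sheet.Width; x += TileSize)
                        {
                            int w = Math.Min(TileSize, sheet.Width - x);
                            int h = Math.Min(TileSize, sheet.Height - y);
                            using (var tile = new Bitmap(w, h))
                            using (var g = Graphics.FromImage(tile))
                            {
                                g.DrawImage(sheet, new Rectangle(0, 0, w, h),
                                            new Rectangle(x, y, w, h), GraphicsUnit.Pixel);
                                tile.Save(string.Format("tile_{0}_{1}.png", y / TileSize, x / TileSize), ImageFormat.Png);
                            }
                        }
                    }
                }
            }
        }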

    Read the article

  • Flipping View transition becomes clunky when using UIPickerViews with a large number of elements

    - by user302246
    I have a very simple app created with the Utility Application project template in Xcode. My MainView has two UIPickerView components and two buttons. The FlipSideView has another UIPickerView. The pickers on the main view each have 4 segments and each segment has 8 rows. The picker on the flip side has just 1 segment with 8 rows. All rows on all pickers are just text. With just this setup, pressing the button to flip the view back and forth shows a noticeable delay before the animation actually starts, and then the animation seems to go faster than it should, as if it's trying to make up for the lost time. I removed the pickers in Interface Builder, loaded the app on the phone, and the animation now seems natural. I also tried just one picker (the flipside one) and things still seem normal. So my current theory is that the number of objects involved in the main view is the cause. The thing is, I don't think it's that many (4 x 8 x 2 = 64), but I could be completely wrong. This is pretty much my first app, so maybe I'm just doing something grossly wrong, or maybe the phone has much more limited processing power than I thought. I am thinking of creating the picker views with pickerView:viewForRow:forComponent:reusingView: to see if this performs better, but I'm not sure if it's just a waste of time. Any suggestions? Thanks, Ruy. P.S.: Testing on a 3G phone on 3.1.2

    Read the article

  • Usage of Maven (and open source in general) in high-governance and risk-averse large organizations

    - by bart
    Does anyone have any good stories of these kinds of organizations being open to using open source (tools like Maven, etc.)? Many staff I've encountered have little or no exposure to open-source software and systems, and open source is treated with great suspicion. Some reasons given for this are lack of support and robustness, which is ironic given the number of end-of-life, unsupported vendor products that are in production. Bonus points for any success stories where you've seen open source go into orgs like this and deliver a real benefit!

    Read the article

  • How do I improve the efficiency of the queries executed by this generic LINQ to SQL data access class?

    - by Lee D
    Hi all, I have a class which provides generic access to LINQ to SQL entities, for example:

        class LinqProvider<T> // where T is a L2S entity class
        {
            DataContext context;

            public virtual IEnumerable<T> GetAll()
            {
                return context.GetTable<T>();
            }

            public virtual T Single(Func<T, bool> condition)
            {
                return context.GetTable<T>().SingleOrDefault(condition);
            }
        }

    From the front end, both of these methods appear to work as you would expect. However, when I run a trace in SQL Profiler, the Single method executes what amounts to a SELECT * FROM [Table], and then returns the single entity that meets the given condition. Obviously this is inefficient, and it is caused by GetTable() returning all rows. My question is, how do I get the query executed by Single() to take the form SELECT * FROM [Table] WHERE [condition], rather than selecting all rows and then filtering out all but one? Is it possible in this context? Any help appreciated, Lee
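
    The usual explanation is that a Func<T, bool> parameter forces the filtering to happen in memory with LINQ to Objects, whereas an Expression<Func<T, bool>> lets LINQ to SQL translate the predicate into the WHERE clause. A minimal sketch of that change (keeping the LinqProvider<T> shape from the question; the constructor is added here only to make the example self-contained):

        using System;
        using System.Data.Linq;
        using System.Linq;
        using System.Linq.Expressions;

        class LinqProvider<T> where T : class
        {
            private readonly DataContext context;

            public LinqProvider(DataContext context) { this.context = context; }

            public virtual IQueryable<T> GetAll()
            {
                return context.GetTable<T>();
            }

            // Because the predicate is an expression tree, it is translated to SQL and
            // the database returns at most one row instead of the whole table.
            public virtual T Single(Expression<Func<T, bool>> condition)
            {
                return context.GetTable<T>().SingleOrDefault(condition);
            }
        }

    With this signature, a call such as provider.Single(p => p.ID == 42) shows up in the trace as a SELECT with a WHERE clause rather than a full table scan.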

    Read the article

  • Make process crash on large memory allocation

    - by Pieter
    I'm trying to find a significant memory leak (15 MB at a time, with allocations like this in multiple places). I checked the most obvious places and then used AQTime, but I still can't pinpoint it. Now I see two options left:

    1) Use SetProcessWorkingSetSize. I've tried this, but my process happily keeps on running when using up more than 150 MB:

        DWORD MemorySize = 150*1024*1024;
        SetProcessWorkingSetSize( GetCurrentProcess(), MemorySize/2, MemorySize*2 );

    2) Put a breakpoint on any allocation of more than 1 MB at a time. How should I do this, overload operator new with an 'if > 1 MB' check inside?

    Read the article

  • Creating a Large Matrix in ff

    - by Ryan Rosario
    I am trying to create a huge matrix in ff, and I know that ff is good for this sort of thing. But there is a major problem: the dimensions of the matrix exceed .Machine$integer.max! I am running on a 64-bit machine, using 64-bit R and 64-bit ff. Is there any way to get around this problem? It's been suggested that R is using the MAXINT value from stdint.h. Is there any way to fix this without changing that file and possibly breaking the build?

        > ffMatrix <- ff(vmode="boolean", dim=c(1e10,1e10))
        Error in if (length < 0 || length > .Machine$integer.max) stop("length must be between 1 and .Machine$integer.max") :
          missing value where TRUE/FALSE needed
        In addition: Warning message:
        In ff(vmode = "boolean", dim = c(1e+10, 1e+10)) : NAs introduced by coercion
        > 1e+10 > .Machine$integer.max
        [1] TRUE

    Read the article

  • RijndaelManaged Padding when data matches block size

    - by trampster
    If I use PKCS7 padding in RijndaelManaged with 16 bytes of data, then I get 32 bytes of data output. It appears that for PKCS7, when the data size matches the block size, it adds a whole extra block of data. If I use Zeros padding for 16 bytes of data, I get out 16 bytes of data. So for Zeros padding, if the data matches the block size then it doesn't pad. I have searched through the documentation and it says nothing about this difference in padding behavior. Can someone please point me to some kind of documentation which specifies what the padding behavior should be for the different padding modes when the data size matches the block size?
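
    That is in fact the defined behavior of PKCS#7 padding (see RFC 5652, section 6.3): padding is always applied, so input that is already an exact multiple of the block size gains one full block of padding bytes, which is what allows the receiver to strip it unambiguously; zero padding adds nothing in that case but cannot distinguish real trailing zero bytes from padding. A small self-contained sketch (illustrative, not from the post) that reproduces both sizes:

        using System;
        using System.Security.Cryptography;

        class PaddingDemo
        {
            // Encrypts exactly one 16-byte block and returns the ciphertext length.
            static int EncryptedLength(PaddingMode padding)
            {
                using (var aes = new RijndaelManaged { Padding = padding })
                {
                    aes.GenerateKey();
                    aes.GenerateIV();
                    byte[] plaintext = new byte[16];
                    using (var encryptor = aes.CreateEncryptor())
                    {
                        return encryptor.TransformFinalBlock(plaintext, 0, plaintext.Length).Length;
                    }
                }
            }

            static void Main()
            {
                Console.WriteLine(EncryptedLength(PaddingMode.PKCS7)); // 32: a whole extra padding block
                Console.WriteLine(EncryptedLength(PaddingMode.Zeros)); // 16: nothing added when the size already fits
            }
        }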

    Read the article

  • Recommendations on Trimming Large Amounts of Text from a DOM Object

    - by aronchick
    I'm doing some in-browser editing, and I have some content that's on the order of 20,000 characters long in a <pre>. So it looks something like:

        <pre>
        Text 1
        Text 2
        Text 3
        Text 4
        [...]
        Text 20,000
        </pre>

    I'd like to use jQuery to trim it down when someone hits a button to chop, but I'm having trouble doing it without overloading the browser. Assume I know that the characters to remove are at positions 16,510 - 17,888, and what I'd like to do is trim them out. I was using:

        jQuery('#textsection').html(jQuery('textarea').html().substr(range.start));

    but browsers seem to enjoy crashing when I do this. Alternatives?

    Read the article

  • Using Large Lists

    - by cam
    In an Outlook AddIn I'm working on, I use a list to grab all the messages in the current folder, then process them, then save them. First, I create a list of all messages, then I create another list from that list, and finally I create a third list of the messages that need to be moved. Essentially, they are all copies of each other, and I structured it this way to keep things organized. Would it increase performance if I used only one list? I thought lists were just references to the actual items.
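
    For reference types, a List<T> stores references rather than copies of the objects, so three lists over the same messages mostly cost three sets of references, not three copies of each message; the bigger question is readability rather than speed. A small self-contained sketch (using a plain class instead of the Outlook object model) showing that the lists share the same instances:

        using System;
        using System.Collections.Generic;
        using System.Linq;

        class Message { public string Subject = ""; }

        class Program
        {
            static void Main()
            {
                var all = new List<Message> { new Message { Subject = "a" }, new Message { Subject = "b" } };

                // "Copying" a list of reference types copies references, not the messages themselves.
                var processed = new List<Message>(all);
                var toMove = processed.Where(m => m.Subject == "b").ToList();

                toMove[0].Subject = "b (moved)";

                // The change is visible through every list, because they all point at the same object.
                Console.WriteLine(all[1].Subject);                             // b (moved)
                Console.WriteLine(object.ReferenceEquals(all[1], toMove[0]));  // True
            }
        }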

    Read the article

  • Efficiently finding the shortest path in large graphs

    - by Björn Lindqvist
    I'm looking for a way to find, in real time, the shortest path between nodes in a huge graph. It has hundreds of thousands of vertices and millions of edges. I know this question has been asked before, and I guess the answer is to use a breadth-first search, but I'm more interested in knowing what software you can use to implement it. For example, it would be totally perfect if a library (with Python bindings!) already existed for performing BFS in undirected graphs.
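
    Whatever library ends up doing the work, for an unweighted graph the shortest path is exactly what a breadth-first search with a predecessor map produces. A minimal sketch of that algorithm (written in C# for illustration, with the graph assumed to be an adjacency list that has an entry for every vertex):

        using System;
        using System.Collections.Generic;

        static class ShortestPath
        {
            // Returns the shortest path from start to goal in an unweighted graph,
            // or null if no path exists.
            public static List<int> Bfs(Dictionary<int, List<int>> adjacency, int start, int goal)
            {
                var predecessor = new Dictionary<int, int> { { start, start } };
                var queue = new Queue<int>();
                queue.Enqueue(start);

                while (queue.Count > 0)
                {
                    int node = queue.Dequeue();
                    if (node == goal)
                    {
                        // Walk the predecessor chain back to the start to recover the path.
                        var path = new List<int>();
                        for (int v = goal; v != start; v = predecessor[v]) path.Add(v);
                        path.Add(start);
                        path.Reverse();
                        return path;
                    }
                    foreach (int next in adjacency[node])
                    {
                        if (!predecessor.ContainsKey(next))
                        {
                            predecessor[next] = node;
                            queue.Enqueue(next);
                        }
                    }
                }
                return null;
            }
        }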

    Read the article

  • Caching Authentication Data

    - by PartlyCloudy
    Hi, I'm currently implementing a REST web service using CouchDB and RESTlet. The RESTlet layer is mainly for authentication and some minor filtering of the JSON data served by CouchDB: Clients <= HTTP => [ RESTlet <= HTTP => CouchDB ]. I'm also using CouchDB to store user login data, because I don't want to add an additional database server just for that purpose. Thus, each request to my service causes two CouchDB requests conducted by RESTlet (auth data + the "real" request). In order to keep the service as efficient as possible, I want to reduce the number of requests, in this case the redundant requests for login data. My idea is to provide a cache (i.e. an LRU cache via LinkedHashMap) within my RESTlet application that caches login data, because HTTP caching will probably not be enough. But how do I invalidate the cached data once a user changes the password, for instance? Thanks to REST, the application might run on several servers in parallel, and I don't want to create a central instance just to cache login data. Currently, I save requested auth data in the cache and try to authenticate new requests against it. If authentication fails or there is no entry available, I dispatch a GET request to my CouchDB storage to obtain the actual auth data. So in the worst case, users who have changed their data will perhaps still be able to log in with their old credentials. How can I deal with that? Or what is a good strategy to keep the cache(s) up to date in general? Thanks in advance.
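
    One common compromise (not from the post, just a sketch of the idea) is to give each cache entry a short time-to-live in addition to the LRU bound, so a changed password is picked up within a known window on every server without any cross-server invalidation traffic. The poster's stack is Java/RESTlet; the sketch below uses C# for consistency with the other examples on this page, but it mirrors the LinkedHashMap approach:

        using System;
        using System.Collections.Generic;

        // Minimal LRU cache with a time-to-live: entries older than the TTL count as misses,
        // which bounds how long stale credentials can keep authenticating.
        class TtlLruCache<TKey, TValue>
        {
            private class Entry { public TKey Key; public TValue Value; public DateTime Added; }

            private readonly int capacity;
            private readonly TimeSpan ttl;
            private readonly Dictionary<TKey, LinkedListNode<Entry>> map = new Dictionary<TKey, LinkedListNode<Entry>>();
            private readonly LinkedList<Entry> order = new LinkedList<Entry>();

            public TtlLruCache(int capacity, TimeSpan ttl)
            {
                this.capacity = capacity;
                this.ttl = ttl;
            }

            public bool TryGet(TKey key, out TValue value)
            {
                value = default(TValue);
                LinkedListNode<Entry> node;
                if (!map.TryGetValue(key, out node)) return false;
                if (DateTime.UtcNow - node.Value.Added > ttl)
                {
                    order.Remove(node);
                    map.Remove(key);
                    return false; // expired: caller re-reads the auth record and calls Set again
                }
                order.Remove(node);
                order.AddFirst(node); // mark as most recently used
                value = node.Value.Value;
                return true;
            }

            public void Set(TKey key, TValue value)
            {
                LinkedListNode<Entry> existing;
                if (map.TryGetValue(key, out existing))
                {
                    order.Remove(existing);
                    map.Remove(key);
                }
                if (map.Count >= capacity)
                {
                    map.Remove(order.Last.Value.Key); // evict least recently used
                    order.RemoveLast();
                }
                map[key] = order.AddFirst(new Entry { Key = key, Value = value, Added = DateTime.UtcNow });
            }
        }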

    Read the article

  • Large Scale VHDL techniques

    - by oxinabox.ucc.asn.au
    I'm thinking about implementing a 16-bit CPU in VHDL. A simple-ish CPU: ADD, MULS, NEG, bit shift, JUMP, relative jump, BREQ, relative BREQ, something along those lines, probably all working only with 16-bit operands. I might even cut it down and use only a single operand and an accumulator, with some status registers: Carry, Zero, Neg (unless I use an accumulator). I know how to design all the parts from logic gates, and plan to build them up from first principles. So for my ALU I'll need to 'build' an adder, probably a carry-lookahead group adder; this adder itself is made up of a couple of parts, which are themselves made up of a couple of parts. Anyway, my problem is not the CPU design, or the VHDL (I know the language, more or less). It's how I should keep things organised. How should I use packages? How should I name my processes and port maps? (I've never seen the benefit of naming port maps or processes.)

    Read the article

  • How to open a large text file in C#

    - by desmati
    I have a text file that contains about 100,000 articles. The structure of the file is:

        BEGIN OF FILE
        .Document ID 42944-YEAR:5
        .Date 03\08\11
        .Cat political
        Article Content 1
        .Document ID 42945-YEAR:5
        .Date 03\08\11
        .Cat political
        Article Content 2
        END OF FILE

    I want to open this file in C# and process it line by line. I tried this code:

        String[] FileLines = File.ReadAllText(TB_SourceFile.Text).Split(Environment.NewLine.ToCharArray());

    But it throws: Exception of type 'System.OutOfMemoryException' was thrown. The question is: how can I open this file and read it line by line? File size: 564 MB (591,886,626 bytes). File encoding: UTF-8. The file contains Unicode characters.
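
    The usual fix is to stream the file instead of loading it all at once, for example with File.ReadLines (or a StreamReader), which yields one line at a time and keeps memory use flat regardless of file size. A sketch under that assumption, reusing the question's TB_SourceFile.Text as the path:

        using System;
        using System.IO;
        using System.Text;

        class ArticleReader
        {
            static void ProcessArticles(string path)
            {
                // File.ReadLines enumerates the file lazily, one line at a time,
                // so a 564 MB file never has to fit in memory at once.
                foreach (string line in File.ReadLines(path, Encoding.UTF8))
                {
                    if (line.StartsWith(".Document ID"))
                    {
                        // A new article starts here: finish processing the previous one.
                    }
                    // per-line processing goes here
                }
            }

            static void Main(string[] args)
            {
                ProcessArticles(args[0]); // in the original code this would be TB_SourceFile.Text
            }
        }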

    Read the article

  • What is the largest file size we can transfer through an AIR application?

    - by Naveen kumar
    Hi all, I'm trying to transfer a large file (1 GB+) using UDP (in packets) through an AIR application. I'm transferring a ByteArray by taking chunks of packets from a FileStream. But it gives 'Error #1000: The system is out of memory' on the sender side after a certain number of packets have been sent, and by that time the downloaded file size at the server side is 256 MB. I tried with other files, but after downloading 256 MB the sender gives the same error. Is it because of the file stream size? How can I solve this problem so that I can transfer files of GB size?
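
    An out-of-memory error like this usually means too much of the file is being buffered at once; the general technique is to read and send one bounded chunk at a time so earlier chunks can be released. The sketch below shows that chunking pattern in C# (for consistency with the other examples here; the poster's ActionScript FileStream supports the same read-a-chunk-then-send loop), with a hypothetical receiver address:

        using System;
        using System.IO;
        using System.Net.Sockets;

        class ChunkedUdpSender
        {
            static void Main(string[] args)
            {
                const int ChunkSize = 1400; // keep each datagram under a typical Ethernet MTU
                using (var udp = new UdpClient("127.0.0.1", 9000)) // hypothetical receiver
                using (var file = File.OpenRead(args[0]))
                {
                    var buffer = new byte[ChunkSize];
                    int read;
                    // Only ChunkSize bytes are held in memory at a time, regardless of file size.
                    while ((read = file.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        udp.Send(buffer, read);
                    }
                }
            }
        }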

    Read the article

  • Handling large arrays with array_diff

    - by bigmac
    I have been trying to compare two arrays. Using array_intersect presents no problems. When using array_diff with arrays of ~5,000 values, it works. When I get to ~10,000 values, the script dies when I reach array_diff. Turning on error_reporting did not produce anything. I tried creating my own array_diff function:

        function manual_array_diff($arraya, $arrayb) {
            foreach ($arraya as $keya => $valuea) {
                if (in_array($valuea, $arrayb)) {
                    unset($arraya[$keya]);
                }
            }
            return $arraya;
        }

    source: http://stackoverflow.com/questions/2479963/how-does-array-diff-work

    I would expect it to be less efficient than the official array_diff, but it can handle arrays of ~10,000. Unfortunately, both array_diffs fail when I get to ~15,000. I tried the same code on a different machine and it runs fine, so it's not an issue with the code or PHP. There must be some limit set somewhere on that particular server. Any idea how I can get around that limit, alter it, or just find out what it is?

    Read the article

  • Changing user in Ubuntu

    - by Rahul Mehta
    Hi, this is my ls -all output. The zfapi folder is owned by root; how can I change this to www-data? Also, please advise what the first root and the second root are. Thanks

        drwxr-xr-x 4 www-data www-data  4096 2011-01-06 18:21 cdnapi
        -rw-r--r-- 1 www-data www-data   678 2010-08-30 12:02 config.js
        drwxr-xr-x 4 www-data www-data  4096 2010-11-23 15:55 css
        drwxr-xr-x 7 www-data www-data  4096 2010-11-17 13:12 images
        -rw-r--r-- 1 www-data www-data 25064 2010-12-17 18:26 index.html
        -rw-r--r-- 1 www-data www-data 19830 2010-12-18 11:24 init.js
        drwxr-xr-x 2 www-data www-data  4096 2010-12-02 12:34 lib
        -rw-r--r-- 1 www-data www-data 18758 2010-12-06 18:00 styles.css
        -rw-r--r-- 1 www-data www-data  1081 2010-10-21 17:56 testbganim.html
        drwxr-xr-x 2 www-data www-data  4096 2010-12-17 11:15 yapi
        drwxr-xr-x 7 root     root      4096 2011-01-07 18:20 zfapi

    Read the article

  • AIR: sync GUI with database?

    - by John Isaacks
    I am going to be building an AIR application that shows a list (about 1-25 rows of data) from a database. The database is on the web. I want the list to be as accurate as possible, meaning that as soon as the database data changes, the list displayed in the app should update as soon as possible. I don't know of any way for the AIR application to be notified when there is a change, so I'm thinking I'm going to have to poll the database at certain intervals to keep an up-to-date list. So my question is: first, is there any way to NOT have to keep checking the database? And if I do have to keep checking the database, what is a reasonable interval to do that at? Thanks.

    Read the article

  • How do you handle large repeated UI elements with jQuery?

    - by jpoz
    Howdy. Here's the situation: you have a very complex UI element that is repeated in a list. Each one has a menu on it, buttons, it hides and shows sub-elements, it has buttons for switching its state, etc. The elements are populated via JSON, so you have to construct the elements and their functionality on the fly. What's the best way to accomplish this with jQuery? Where would you save the reusable template for the DOM structure? How would you add the behavior: $().live? .livequery? onclick? Manually after every JSON get? I guess I just see a lot of people doing different things. What's your experience with performance? Any insight would be much appreciated. Thanks, JPoz

    Read the article
