Search Results

Search found 91480 results on 3660 pages for 'large data in sharepoint list'.

Page 260/3660

  • How to transfer large files from desktop to server (.NET)

    - by rahulchandran
    I am writing a .NET 2.0 desktop client that will send large files (well, largish: under 2 GB) to a server. I need to develop the server as well, and it can be on any technology. The transfer should be secure, so an underlying SSL stream is needed. What are my options, and are there any obvious caveats I should be aware of? To my mind the simplest solution is to open a TCP/IP connection over SSL to the server, send n packets of M bytes each, have the server append the chunks to the file, and finally send an EOF packet. Is this horrible? Will performance suffer on the server with all these disk writes? What other, cleverer options are there? I am limited to .NET 2.0 on the client; if I did move to a WCF client, would it buy me something magical for this scenario? Thanks
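
    A rough sketch of that chunked approach, in Node/TypeScript rather than .NET (the host, port, 64 KB chunk size, and the server's append-on-receive behaviour are all assumptions, not a real protocol):

      import * as fs from "node:fs";
      import * as tls from "node:tls";

      // Stream a file to an append-only server over a TLS socket in fixed-size chunks.
      function upload(path: string, host: string, port: number): void {
        const socket = tls.connect({ host, port }, () => {
          const chunks = fs.createReadStream(path, { highWaterMark: 64 * 1024 }); // 64 KB per read
          chunks.pipe(socket); // pipe() handles backpressure and closes the socket when the file ends
        });
        socket.on("error", (err) => console.error("upload failed:", err));
      }

    Since the chunks arrive in order, the server's writes are sequential appends, which is generally the cheapest disk pattern there is.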

    Read the article

  • Slow record deletion with large ntext values

    - by asking
    I'm having trouble deleting some records via a stored procedure from a table in SQL Server 2008 R2 that has ntext columns. The stored procedure is timing out, and running the query directly takes a very long time. The initial query was a straight "delete from y where x = z", and I've also tried running it in batches of 1000 with transactions, but it is still slow and times out in the stored procedure. The majority of the records in the table will not be deleted each time (it's not a one-off query; it will be run again later). The ntext columns are not used in the where clause and I can't change the column types. Any suggestions on the quickest way to delete records with large ntext values? Thanks
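
    One minimal sketch of the batching idea (runQuery is a hypothetical helper standing in for however the T-SQL gets executed and the affected-row count reported); keeping each statement to a few thousand rows keeps its lock and log footprint small:

      // Repeat small DELETEs until nothing is left to delete.
      async function deleteInBatches(runQuery: (sql: string) => Promise<number>, z: string): Promise<void> {
        let affected: number;
        do {
          // In real code the filter value should be parameterised, not interpolated.
          affected = await runQuery(`DELETE TOP (1000) FROM y WHERE x = '${z}'`);
        } while (affected > 0);
      }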

    Read the article

  • A programming language for teaching data structures and algorithms with? [closed]

    - by Andreas Grech
    Possible Duplicate: Choice of programming language for learning data structures and algorithms

    Teachers have different opinions on which programming language they would choose to teach data structures and algorithms. Some prefer a lower-level language such as C because it lets the student learn more about what goes on beneath the abstractions, in terms of memory allocation and deallocation, pointers, and pointer arithmetic. Others say they would prefer a higher-level language like Java because it lets the student focus on the concepts of the structures and on algorithm design rather than 'waste time' fiddling with segmentation faults and all the blunders that come with manual memory management. What is your take on this issue? Please also post any references you know of that discuss this question.

    Read the article

  • Paging a UIScrollView with a large PDF

    - by Fousa
    I'm trying to create a simple UIScrollView with paging so I can scroll through a large PDF document, but this gives me some problems... I tried the following options: converting all the PDF pages to UIImages at startup, which works but is very slow to start, and manually drawing each PDF page in drawRect, but that was slow again... I would prefer not to load everything at startup but to load pages as they are used. Has anyone done this recently? I can't seem to find a nice example project. Thanks! Jelle
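
    A language-agnostic sketch of the lazy option (renderPage and releasePage are hypothetical stand-ins for whatever actually draws or frees a page; this is not UIKit code): only the current page and its neighbours are kept rendered.

      function updateVisiblePages(
        current: number,
        pageCount: number,
        rendered: Map<number, unknown>,
        renderPage: (i: number) => unknown,
        releasePage: (view: unknown) => void
      ): void {
        const wanted = new Set(
          [current - 1, current, current + 1].filter(i => i >= 0 && i < pageCount)
        );
        for (const [i, view] of rendered) {
          if (!wanted.has(i)) { releasePage(view); rendered.delete(i); } // free pages that scrolled away
        }
        for (const i of wanted) {
          if (!rendered.has(i)) rendered.set(i, renderPage(i)); // draw pages on demand, never at startup
        }
      }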

    Read the article

  • MySQL master-slave replication on a large database table (how to sync initial data)

    - by Brian Lovett
    We have a production server and a dev server. We have found that backups are nearly impossible on the production server because of the query volume we experience. So we're looking at setting up replication with our dev server as the slave. This is ideal because we can afford to lock the tables on that server, and additionally it will be nice to have up-to-date data for the developers. Now, the issues: the production server can't really be taken down or locked at this point, at least not easily. We have a high query volume and fairly large (30+ GB) InnoDB tables. Both servers are all-InnoDB and both run MySQL 5.1. What can we do to sync the data initially to get replication started? I've tried a few options, but so far none have worked.

    Read the article

  • Getting HIERARCHY_REQUEST_ERR when using JavaScript to recursively generate a nested list

    - by Mark
    I have a method that takes in a list. The list can contain data and other lists. The end goal is to convert something like ["a", "b", ["c", "d"]] into:

      <ol>
        <li><b>a</b></li>
        <li><b>b</b></li>
        <ol>
          <li><b>c</b></li>
          <li><b>d</b></li>
        </ol>
      </ol>

    The code is:

      function $(tagName) {
        return document.createElement(tagName);
      }

      // returns an html element representing data
      // data should be an array or some sort of value
      function tagMaker(data) {
        tag = null;
        if (data instanceof Array) {
          // data is an array, represent using <ol>
          tag = $("ol");
          for (i = 0; i < data.length; i++) {
            // construct one <li> for each item in the array
            listItem = $("li");
            // get the html element representing this particular item in the array
            child = tagMaker(data[i]);
            // <li>*html for child*</li>
            listItem.appendChild(child);
            // add this item to the list
            tag.appendChild(listItem);
          }
        } else {
          // data is not an array, represent using <b>data</b>
          tag = $("b");
          tag.innerHTML = data.toString();
        }
        return tag;
      }

    Calling tagMaker throws HIERARCHY_REQUEST_ERR: DOM Exception 3, rather than generating an HTML element object that I could append to document.body.
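
    The most likely culprit is that tag, i, listItem and child are implicit globals: the recursive call overwrites the caller's tag with the element it just built, so the <li> that already contains that element gets appended into it, and the cycle raises HIERARCHY_REQUEST_ERR. A sketch of the same recursion with block-scoped locals (in TypeScript, for the type annotations):

      type NestedList = string | NestedList[];

      function tagMaker(data: NestedList): HTMLElement {
        if (Array.isArray(data)) {
          const tag = document.createElement("ol");    // local to this call
          for (const item of data) {
            const listItem = document.createElement("li");
            listItem.appendChild(tagMaker(item));       // recursion can no longer clobber tag
            tag.appendChild(listItem);
          }
          return tag;
        }
        const leaf = document.createElement("b");       // leaf value: <b>data</b>
        leaf.textContent = data;
        return leaf;
      }

      // usage: document.body.appendChild(tagMaker(["a", "b", ["c", "d"]]));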

    Read the article

  • Namespacing large JavaScript like jQuery

    - by frenchie
    I have a very large JavaScript file: over 9,000 lines. The code looks like this: var GlobalVar1 = ""; var GlobalVar2 = null; function A() {...} function B(SomeParameter) {...} I'm using the Google compiler, and the global variables and functions get renamed a, b, c..., so there's a good chance of a collision later with some outside code. What I want is to have my code organized like the jQuery library, where everything is accessible through $. Is there a way to namespace my code so that everything is behind a # character, for example? I'd like to call my code like this: #.GlobalVar #.functionA(SomeParameter) How can I do this? Thanks.
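
    A minimal sketch of the single-global-object pattern that jQuery uses (the name MYLIB is an arbitrary placeholder; "#" itself is not a legal JavaScript identifier, which is why libraries settle for names like $):

      const MYLIB = (() => {
        let globalVar1 = "";                  // former globals become closure-private state

        function functionA(someParameter: string): void {
          globalVar1 = someParameter;         // internal code can still reach the shared state
        }

        // only what is returned here is visible outside; everything else can be renamed freely
        return { functionA, get globalVar1() { return globalVar1; } };
      })();

      // usage: MYLIB.functionA("hello"); console.log(MYLIB.globalVar1);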

    Read the article

  • Unable to return large result set ORA-22814

    - by rvenugopal
    Hello all, I am encountering an issue when trying to load a large result set using a range query in Oracle 10g. When I try a smaller range (1 to 100) it works, but with a larger range (1 to 1000) I get the error "ORA-22814: attribute or element value is larger than specified in type". I have a basic UDT (PostComments_Type) and I have tried using both a VARRAY and a table type of PostComments_Type, but that hasn't made a difference. Your help is appreciated -- Thanks, Venu

      PROCEDURE RangeLoad (
          floorId   IN NUMBER,
          ceilingId IN NUMBER,
          o_PostComments_LARGE_COLL_TYPE OUT PostComments_LARGE_COLL_TYPE -- tried as a VARRAY and as a table type of PostComments_Type
      ) IS
      BEGIN
          SELECT PostComments_TYPE (
                     PostComments_ID,
                     ...
                 )
          BULK COLLECT INTO o_PostComments_LARGE_COLL_TYPE -- VARRAY/table type, so a bulk operation
          FROM PostComments
          WHERE PostComments_ID BETWEEN floorId AND ceilingId;
      END RangeLoad;

    Read the article

  • Processing large recordsets in Rails

    - by japancheese
    Hello, I'm trying to perform a daily operation on a larger-than-normal dataset (2m+ records). However, Rails seems to take a very long time performing operations on such a dataset. Operations like Dataset.all.each do |data| ... end take a very long time to complete (I assume this is because it can't fit all the items into memory at once, right?). Does anyone have any strategies for handling this situation? I know SQL would probably speed up the process, but I'm looking to use the Rails environment, as I can do many more complicated things to the data than I can with SQL statements alone.
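
    A sketch of batch-wise processing, so only one slice of the table is in memory at a time (fetchBatch is a hypothetical stand-in for the data layer; in Rails itself, batched finders such as find_each play this role):

      async function processAll(
        fetchBatch: (offset: number, limit: number) => Promise<object[]>,
        handle: (record: object) => void,
        batchSize = 1000
      ): Promise<void> {
        for (let offset = 0; ; offset += batchSize) {
          const batch = await fetchBatch(offset, batchSize); // one batch in memory at a time
          if (batch.length === 0) break;                     // ran out of records
          batch.forEach(handle);                             // complicated per-record logic stays in app code
        }
      }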

    Read the article

  • How to retrieve large data from an Oracle database using VBScript

    - by allenzzzxd
    Hi guys, I'm working with VBScript to do some testing. I want to retrieve a large amount of data from an Oracle database, so I wrote code like this: sql = "Select * from CORE_DB where MC = '" & mstr & "' " Set myrs = db_execute_query(curConnection, sql) Then I count the rows in myrs; there are 248 rows. So then I run a For loop to retrieve some fields of each row: For k = 0 To db_get_rows_count(myrs) But I found that the content of row k, for every k > 133, was always equal to that of row 133, and this causes an error. Could there be a limit on the size of myrs? Could anyone enlighten me about this? Thanks a lot in advance

    Read the article

  • Transforming large XML files

    - by Chad
    I was using this extension method to transform very large XML files with an XSLT. Unfortunately, I get an OutOfMemoryException on the source.ToString() line. I realize there must be a better way, I'm just not sure what that would be?

      public static XElement Transform(this XElement source, string xslPath, XsltArgumentList arguments)
      {
          var doc = new XmlDocument();
          doc.LoadXml(source.ToString());

          var xsl = new XslCompiledTransform();
          xsl.Load(xslPath);

          using (var swDocument = new StringWriter(System.Globalization.CultureInfo.InvariantCulture))
          {
              using (var xtw = new XmlTextWriter(swDocument))
              {
                  xsl.Transform((doc.CreateNavigator()), arguments, xtw);
                  xtw.Flush();
                  return XElement.Parse(swDocument.ToString());
              }
          }
      }

    Thoughts? Solutions? Etc.

    Read the article

  • Are there version control systems that allow you to permanently delete files?

    - by Andrea Francia
    I need to keep some large files (several gigabytes) under version control. I don't need, and can't afford, to keep every version of these files, so I want to be able to permanently remove old versions of the large files from my VCS at some point. Which version control system could I use? EDIT: The files that I want to keep under version control are big .zip files or ISO images. These files may contain executable software or data (seismic data, SAR images, GNSS data), and they are provided by my company's software supplier.

    Read the article

  • How to restore a very large .bak file (180 GB) in SQL Server 2008

    - by Umutos
    Hello! I have a very large .bak file (180 GB) that came from Microsoft SQL Server 2008, and I have to restore it. I first installed Microsoft SQL Server 2008 Express and tried to restore it in SQL Server Management Studio Express, but it didn't work because there is a size limit. Does anybody know a method for restoring the file? It's the first time I've worked with Microsoft SQL Server and I have no clue what to do. It's really urgent and I would be grateful for any help! Thanks a lot! Umutos

    Read the article

  • Start diving into large open source projects

    - by Vanangamudi
    How do I start learning and reading the source of large and complex projects like Blender3D and GIMP, for instance? Since the developers are busy improving them and no docs exist at present, how do we start developing and customizing these projects? The Linux kernel deserves the several books written about its code, and these kinds of projects deserve the same. There are also no unit tests available for this kind of project. Say I'm going to read and understand the source code of Blender: how do I start? How do I set up a development environment for working on the app? If it has several dependencies, and assuming their source code is also available, how do I set up this kind of inter-related, coherent source code for debugging?

    Read the article

  • Region or ItemsSource for large data set in ListBox

    - by Ryan
    I'm having trouble figuring out the best solution in the following situation. I'm using Prism 4.1, MEF, and .NET 4.0. I have a Project object that could have a large number (~1,000) of Line objects. I'm deciding whether it is better to expose an ObservableCollection<LineViewModel> from my ProjectViewModel and manually create the Line view models there, OR to make the ListBox its own region and activate views that way. I'd still want my LineViewModel to have Prism's shared services (IEventAggregator, etc.) injected, but I don't know how to do that when I manually create the LineViewModel. Any suggestions or thoughts?

    Read the article

  • Large number of tables and Hibernate memory consumption

    - by Vedran
    I'm working on a large ERP project whose database model has about 2,100 tables. With "only" 500 tables mapped with Hibernate, the application deployed on the web server takes about 3 GB of working memory. Is there any way to reduce Hibernate's metamodel memory footprint when using that many tables in one persistence unit? Or should I just give up on ORMs and go with plain old JDBC (or even jOOQ)? Right now I'm using Hibernate 4.1.8, Spring 3.1.3, JBoss AS 7.1, and an MSSQL database. Edit: JavaMelody memory histogram output -- with 2,000 generated test tables that are a bit smaller in scope than the original DB model (hence 'only' 1.3 GB of memory used)

    Read the article

  • Saving Core Data in a thread: how to ensure it's done writing before quitting?

    - by Shizam
    So I'm saving small images to Core Data, which takes a really short time per image (about 0.2 seconds), but I'm doing it while the user is flipping through a scroll view, so to improve responsiveness I'm moving the saving to a thread. This works great: everything gets saved and the app stays responsive. However, there is one thing in the Core Data multithreading documentation that worries me: "In Cocoa, only the main thread is not-detached. If you need to save on other threads, you must write additional code such that the main thread prevents the application from quitting until all the save operation is complete." OK, how do you do that? It only needs to last ~0.2 seconds, and it will rarely matter, since the chance of the app quitting while something is saving is very low. How do I run something on the main thread that will prevent the app from quitting AND not block the GUI? Thanks
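
    Not a Cocoa answer, just a sketch of the general pattern the documentation is asking for: keep a registry of in-flight saves and have the quit path wait for it to drain, while the UI itself never blocks on an individual save.

      const pendingSaves = new Set<Promise<void>>();

      function trackSave(save: Promise<void>): Promise<void> {
        pendingSaves.add(save);
        save.finally(() => pendingSaves.delete(save)); // forget it once the write has finished
        return save;
      }

      // call this from the application's "about to quit" hook
      async function waitForPendingSaves(): Promise<void> {
        await Promise.allSettled([...pendingSaves]);   // delays shutdown, not the UI
      }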

    Read the article

  • Locking DB w/ Large Reads (Ruby-on-Rails/Heroku)

    - by Splashlin
    Currently I have a web API running on Heroku that is constantly writing information we're collecting from other data sources (currently there's about half a GB of data and it's growing very quickly). We're looking to add a reporting system on top of the current database that we can use to extract useful information from the DB. The problem is that when we run reports we lock the DB, and any other sites communicating with the DB time out. Does anyone have any solutions for this type of issue? Amazon RDS seems to have some interesting features for database replication, but I don't know if that will solve my problem. Any advice would be greatly appreciated. Thanks

    Read the article

  • Need advice on cron job'ing a very large process

    - by Arms
    I have a PHP script that grabs data from an external service and saves it to my database. I need this script to run once every minute for every user in the system (of which I expect there to be thousands). My question is: what's the most efficient way to run this per user, per minute? At first I thought I would have a function that grabs all the user IDs from my database, iterates over the IDs, and performs the task for each one, but I think that as the number of users grows this will take longer and no longer fit within one-minute intervals. Perhaps I should queue the user IDs and perform the task individually for each one? In that case, I'm actually unsure of how to proceed. Thanks in advance for any advice.
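
    A sketch of the queue idea (not PHP; fetchUserIds and syncUser are hypothetical stand-ins for the real database call and the external-service call): the minute's worth of user IDs is queued once and a fixed pool of workers drains it, so one slow user doesn't hold up the rest.

      async function runOnce(
        fetchUserIds: () => Promise<number[]>,
        syncUser: (id: number) => Promise<void>,
        concurrency = 10
      ): Promise<void> {
        const queue = await fetchUserIds();                  // enqueue everyone for this minute
        const workers = Array.from({ length: concurrency }, async () => {
          for (let id = queue.shift(); id !== undefined; id = queue.shift()) {
            await syncUser(id).catch(err => console.error(`user ${id} failed`, err));
          }
        });
        await Promise.all(workers);                          // the whole batch for this minute is done
      }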

    Read the article

  • jQuery: position a 'close icon' div in the top right of a large image

    - by Blankman
    My webpage has a large image (a map). I want to position a small icon at the top right of the image that closes the map. How can I figure out the position and place the icon appropriately? The image has a fixed size of 900 x 600. I have jQuery on the page if that helps. I tried using $("#map").position() and I have the top and left, but I'm not sure how to place the icon at the top right.
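
    A plain-DOM sketch of one way to do it (the #map and #close-icon ids are assumptions, and it presumes the icon and the image share the same positioned ancestor): take the image's top edge, then its left edge plus its width minus the icon's own width.

      const map = document.querySelector<HTMLElement>("#map")!;
      const icon = document.querySelector<HTMLElement>("#close-icon")!;

      icon.style.position = "absolute";
      icon.style.top = `${map.offsetTop}px`;                                        // flush with the image's top
      icon.style.left = `${map.offsetLeft + map.offsetWidth - icon.offsetWidth}px`; // tucked into the top-right corner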

    Read the article

  • iPhone application development: passing data to and from the server

    - by SAPNA
    I have to develop an iPhone application. The user logs in through the iPhone and gets data stored in the database. Our database is created in MySQL, the website is developed in (classic) ASP, and the interface is built with the iPhone SDK. The connection piece is what remains: what should I use for transferring data to and from the server, JSON or SOAP? Is XML parsing necessary? I am actually very new to this field, so I'm a bit confused. We have some time left to complete our application, but I'm in urgent need of help. Thank you in advance.

    Read the article

  • Are there any memory restrictions on an ASP.NET application? HttpHandler?

    - by tpower
    I have an ASP.NET MVC application that allows users to upload images. When I try to upload a really large file (400 MB), I get an error. I assumed that my home-brewed image-processing code was very inefficient, so I decided to try a third-party library to handle the image-processing parts. Because I'm using TDD, I wanted to first write a failing test. But when I test the controller action with the same large file, it does all the image processing without any trouble. The error I get is "Out of memory". I'm sure my code is probably using a lot more memory than it needs to, but I just want to know why my test passes. The other difference is that I'm using SWFUpload, which is not used in the test. Could this be the cause?

    Read the article
