Search Results

Search found 91480 results on 3660 pages for 'large data in sharepoint list'.

Page 270/3660 | < Previous Page | 266 267 268 269 270 271 272 273 274 275 276 277  | Next Page >

  • How do I recover my accidentally lost Windows partitions after installing Ubuntu?

    - by Totally newbie
    I have a Toshiba Satellite A-200 laptop that ran Vista with four NTFS partitions: (C:) Vista, (D:) Entertainment, (E:) Work, and (F:) Sources. I wanted to switch to Ubuntu, so I first tried it from the live CD; everything was OK and all the partitions were visible and working, so I decided to install Ubuntu over Vista on the (C:) drive. Since doing that I can no longer find my folders and files on the (D:), (E:) and (F:) partitions, and the only file system shown is a single 198 GB one, although my HDD is 320 GB. I can't access the data on the remaining 120 GB, which I hope is still there and not totally lost. I am now working from the live CD but am unable to install testdisk. Can I restore the Vista partitions with the product recovery CD, returning the laptop to factory settings? Can I recover the NTFS partitions using a Windows recovery program, or would that make the problem worse? I need this data badly as I have no backup of it.

    Read the article

  • Organizing your Data Access Layer

    - by nighthawk457
    I am using Entity Framework as my ORM in an ASP.NET application. The database already existed, so I generated the entity model from it. What is a good way to organize the files/classes in the data access layer? My Entity Framework model lives in a class library, and I was planning to add one class per entity (i.e. per database table) and put all the queries related to each table in its respective class. I am not sure whether this is the right approach, and if it is, where do queries that need data from multiple tables go? Am I completely wrong to organize my files by entity/table, and should I organize them by functional area instead?
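
    A minimal sketch of the layout the poster describes: one class per entity for single-table queries, with cross-table queries grouped by functional area instead of by table. This is not from the article; the context and type names (MyEntities, Customer, Order, OrderSummary, CustomerQueries, OrderReportQueries) are hypothetical stand-ins for the generated EF model.

        using System.Linq;

        // Hypothetical DTO for a cross-table projection.
        public class OrderSummary
        {
            public int OrderId { get; set; }
            public decimal Total { get; set; }
        }

        // Per-entity class: every query that touches only the Customers table.
        public class CustomerQueries
        {
            private readonly MyEntities _ctx;   // hypothetical generated EF context
            public CustomerQueries(MyEntities ctx) { _ctx = ctx; }

            public Customer GetById(int id)
            {
                return _ctx.Customers.Single(c => c.Id == id);
            }
        }

        // Queries spanning several tables live in a class named for the
        // functional area (here, order reporting), not for any single table.
        public class OrderReportQueries
        {
            private readonly MyEntities _ctx;
            public OrderReportQueries(MyEntities ctx) { _ctx = ctx; }

            public IQueryable<OrderSummary> SummariesForCustomer(int customerId)
            {
                return from o in _ctx.Orders
                       where o.CustomerId == customerId
                       select new OrderSummary { OrderId = o.Id, Total = o.Total };
            }
        }

    Mixing the two conventions (per-entity classes for single-table queries, functional-area classes for joins) is a common compromise; nothing in EF forces either choice.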

    Read the article

  • Ubuntu crashed during the update to 12.04, now I can't recover my files, help please

    - by mrah
    I'm pretty new to Linux and only installed Ubuntu because I couldn't afford to buy Windows; it worked well and I liked it. I chose to upgrade to the newest version after a prompt, but the update froze and the machine was unresponsive, which forced me to hard reboot it. Now nothing seems to load and I've reached my wits' end (mainly because I'm lost in all the command lines). I've decided to try to recover my data from the hard drive, only two folders, by selecting the Try Ubuntu option when I insert the OS CD into the machine. The problem I'm experiencing now is that it won't let me copy my folders; I get 'The folder contents could not be displayed. You do not have the permissions necessary to view the contents of "folder_name".' Does anyone know how I can recover this data?

    Read the article

  • How much time do you need in between large projects?

    - by Mattio
    You've launched a large project at work, something that's been in progress and taken up large chunks of your life for more than 6 months. The post-launch triage is over. Tech support isn't calling you every hour because they don't know how to troubleshoot an issue. Your hours drop from 60+/wk to whatever is normal in your organization (which is hopefully less than 60+!). How much time do you (or your team) need before the next large project begins? I was asked this question at work and I think the ideal minimum is two weeks -- one week to clear your desk and inbox + one week to clear your head and remember what it's like to have a life outside of work. I'd frankly acknowledge that just being asked this question is a huge boon to work/life balance. But I do think it's possible to go too long in between.

    Read the article

  • Android application that visualizes real-time data

    - by matarsak
    I want to build an Android app that visualizes real-time data. I have set up a UDP channel that receives the data; now I want to visualize it. I know I could use OpenGL ES, but I have no background in it and I don't think I can learn it in a few weeks. What about Processing for Android? Could it be used for an intensive visualization task like this, or is it too limited? I've heard it isn't hard to learn. Any other options?

    Read the article

  • Partition Hard Drive For Data

    - by user211779
    Greetings ~ I am a new Linux/Ubuntu user. For various reasons (mostly my own ignorance) I am on my third install of Ubuntu 12.04, so I want to partition the hard drive to keep my data and personal files on a separate partition in case I ever have to reinstall. I have been struggling all afternoon to make a GParted live USB; Tuxboot looked like the answer, but I get an error message when using it. So I am asking for help: what do you recommend for partitioning the hard drive for data and personal files?

    Read the article

  • Tuning Distributed Applications to Access Big Data

    Distributed applications are just that: distributed across one or more hardware platforms across the enterprise. The database administrator (DBA) has the unenviable task of monitoring these environments and configuring and tuning the database server to meet multiple needs. As multiple distributed applications now require access to a very large data store, what tuning options are available to help?

    Read the article

  • How to bind data to a DropDownList in Kendo UI Mobile

    - by dinesh Haraveer
    I have been using Kendo UI Mobile to develop an application. I previously built the same application with Kendo UI Web, where it works fine. The main problem is that I have to bind data to two dropdown lists with the code below, and when my application runs it shows the error "Microsoft JScript runtime error: Object doesn't support property or method 'append'".

    The HTML:

        <div id="forms" data-role="view" data-title="Form Elements" data-init="initForm">
          <table>
            <tr>
              <td><label style="margin-left: 20px">Company:</label></td>
              <td>
                <select id="ddlCompany" style="width: 200px">
                  <option>Select Company</option>
                </select>
              </td>
              <td class="style1"><label style="margin-left: 20px">Category:</label></td>
              <td>
                <select id="ddlCategory" style="width: 200px">
                  <option>Select Category</option>
                </select>
              </td>
              <td><label style="margin-left: 20px">Product :</label></td>
              <td>
                <select id="ddlProduct" style="width: 200px">
                  <option>Select Product</option>
                </select>
              </td>
            </tr>
          </table>
        </div>

    The script:

        function initForm() {
            $.ajax({
                type: "POST",
                contentType: "application/json; charset=utf-8",
                url: "FlashReportMobileWebService.asmx/GetCompany",
                dataType: "json",
                success: function (data) {
                    for (i = 0; i < data.d.length; i++) {
                        ddlCompany.append($("<option></option>").val(data.d[i].Company).html(data.d[i].Company));
                    };
                    $("#ddlCompany").kendoDropDownList();
                }
            });
            $.ajax({
                type: "POST",
                contentType: "application/json; charset=utf-8",
                url: "FlashReportMobileWebService.asmx/ToCategoryDropDown",
                dataType: "json",
                success: function (data) {
                    for (i = 0; i < data.d.length; i++) {
                        ddlCategory.append($("<option></option>").val(data.d[i].Category).html(data.d[i].Category));
                    };
                    $("#ddlCategory").kendoDropDownList();
                },
                failure: function (msg) {
                    alert(msg);
                }
            });
        }

        $("#ddlCategory").change(function (e) {
            var ddlProduct = $("#ddlProduct");
            var dataItem = $("#ddlCategory").val();
            $.ajax({
                type: "POST",
                contentType: "application/json; charset=utf-8",
                data: "{'Category':'" + dataItem + "'}",
                url: "FlashReportWebService.asmx/ToFillProductDropDown",
                dataType: "json",
                success: function (data) {
                    ddlProduct.empty();
                    for (i = 0; i < data.d.length; i++) {
                        ddlProduct.append($("<option></option>").val(data.d[i].ProductName).html(data.d[i].ProductName));
                    };
                    $("#ddlProduct").kendoDropDownList();
                },
                failure: function (msg) {
                    alert(msg);
                }
            });
        });

        var app = new kendo.mobile.Application(document.body);

    Thanks for reading this.

    Read the article

  • Solution for RPC_E_ATTEMPTED_MULTITHREAD error caused by SPRequestContext caching SPSites?

    - by kerray
    Hi, I'm developing a solution for SharePoint 2007, and I'm using SPSecurity.RunWithElevatedPrivileges a lot, passing in the UserToken of the SystemAccount. After reading http://hristopavlov.wordpress.com/2009/01/19/understanding-sharepoint-sprequest/ I finally began to understand why I get these errors:

        System.Runtime.InteropServices.COMException (0x80010102): Attempted to make calls on more than
        one thread in single threaded mode. (Exception from HRESULT: 0x80010102 (RPC_E_ATTEMPTED_MULTITHREAD))

    but there seems to be no solution - a "known issue in the product". The article is more than a year old, and I wasn't able to find anything more recent and helpful, but I was hoping maybe someone else has? My code goes like this:

        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            using (SPSite elevatedSite = new SPSite(web.Site.ID, web.Site.SystemAccount.UserToken))
            {
                using (SPWeb elevatedWeb = elevatedSite.OpenWeb(web.ID))
                {
                    // some operations on lists and items obtained through elevatedWeb
                }
            }
        });

    The errors come up wherever such elevated code is used, and more often when more users exercise these functionalities, so I suspect the elevated SPSite is being cached and reused. Is there any way to solve this? If my understanding is correct, how do I make SharePoint forget about the cached SPSites and use a fresh one instead? Thanks

    Read the article

  • Uploading large XML to a WCF REST service -> 400 Bad Request

    - by glenn.danthi
    I am trying to upload large XML files to a REST service. I have tried almost all the methods suggested on Stack Overflow and Google, but I still can't find where I am going wrong: I cannot upload a file greater than 64 KB. I have specified maxRequestLength:

        <httpRuntime maxRequestLength="65536"/>

    and my binding config is as follows:

        <bindings>
          <webHttpBinding>
            <binding name="RESTBinding" maxBufferSize="67108864" maxReceivedMessageSize="67108864"
                     openTimeout="00:10:00" receiveTimeout="00:10:00" sendTimeout="00:10:00">
              <readerQuotas maxDepth="2147483647" maxStringContentLength="2147483647"
                            maxArrayLength="2147483647" maxBytesPerRead="2147483647"
                            maxNameTableCharCount="2147483647"/>
            </binding>
          </webHttpBinding>
        </bindings>

    On the C# client side I am doing the following:

        WebRequest request = HttpWebRequest.Create(@"http://localhost.:2381/RepositoryServices.svc/deviceprofile/AddDdxml");
        request.Credentials = new NetworkCredential("blah", "blah");
        request.Method = "POST";
        request.ContentType = "application/xml";
        request.ContentLength = byteArray.LongLength;
        using (Stream postStream = request.GetRequestStream())
        {
            postStream.Write(byteArray, 0, byteArray.Length);
        }

    There is no special configuration on the client side.

    Read the article

  • Ext.data.Store, Javascript Arrays and Ext.grid.ColumnModel

    - by Michael Wales
    I am using Ext.data.Store to call a PHP script which returns a JSON response with some metadata about fields that will be used in a query (unique name, table, field, and user-friendly title). I then loop through the Ext.data.Record objects, placing the data I need into an array (this_column), push that array onto the end of another array (columns), and eventually pass this to an Ext.grid.ColumnModel object. The problem is that, no matter which query I test against (I have a number of them, varying in size and complexity), the columns array always works as expected up to columns[15]. At columns[16], that index and all previous ones are filled with the value of columns[15]. This continues until the loop reaches the end of the Ext.data.Store object, at which point the entire array consists of the same value. Here's some code:

        columns = [];
        this_column = [];

        var MetaData = Ext.data.Record.create([
            {name: 'id'},
            {name: 'table'},
            {name: 'field'},
            {name: 'title'}
        ]);

        // Query the server for metadata for the query we're about to run
        metaDataStore = new Ext.data.Store({
            autoLoad: true,
            reader: new Ext.data.JsonReader({
                totalProperty: 'results',
                root: 'fields',
                id: 'id'
            }, MetaData),
            proxy: new Ext.data.HttpProxy({
                url: 'index.php/' + type + '/' + slug
            }),
            listeners: {
                'load': function () {
                    metaDataStore.each(function(r) {
                        this_column['id'] = r.data['id'];
                        this_column['header'] = r.data['title'];
                        this_column['sortable'] = true;
                        this_column['dataIndex'] = r.data['table'] + '.' + r.data['field'];

                        // This displays valid information, through the entire process
                        console.info(this_column['id'] + ' : ' + this_column['header'] + ' : ' + this_column['sortable'] + ' : ' + this_column['dataIndex']);

                        columns.push(this_column);
                    });

                    // This goes nuts at columns[15]
                    console.info(columns);

                    gridColModel = new Ext.grid.ColumnModel({
                        columns: columns
                    });
                }
            }
        });

    Read the article

  • iPhone Core Data Lightweight Migration error: reason = "Can't find model for source store";

    - by tul697
    Steps taken:
    1. Added a data model version: changed my XXX.xcdatamodel to XXX.xcdatamodeld with Design - Data Model - Add Model Version.
    2. Set the new XXX 2.xcdatamodel as the current version.
    3. Added an attribute to XXX 2.xcdatamodel.
    4. Added NSMigratePersistentStoresAutomaticallyOption and NSInferMappingModelAutomaticallyOption in addPersistentStoreWithType, as most tutorials suggest.
    I ran the code and got this error:

        Unresolved error Error Domain=NSCocoaErrorDomain Code=134130 UserInfo=0x146bb80
        "Operation could not be completed. (Cocoa error 134130.)", {
            URL = file://localhost/Users/tleung/Library/Application%20Support/iPhone%20Simulator/3.0/Applications/B585CDFC-17C3-4A44-84E2-0B75893C46B8/Documents/favorites.sqlite;
            metadata = {
                NSPersistenceFrameworkVersion = 241;
                NSStoreModelVersionHashes = {
                    City = <70ea1f9f aaa9af29 52d2bfe4 3071d97f 8224f765 d69928d5 e5844120 52742a35>;
                    StationStore = <40d8093a 1d7d00ec 178b4374 36dfc137 ccfa3a88 87e2d467 69e8ae7e d4c49dbb>;
                };
                NSStoreModelVersionHashesVersion = 3;
                NSStoreModelVersionIdentifiers = ( );
                NSStoreType = SQLite;
                NSStoreUUID = "9DD342A6-1F68-4997-A097-096DC96D7BF3";
            };
            reason = "Can't find model for source store";
        }

    I've also tried

        NSString *path = [[NSBundle mainBundle] pathForResource:@"YOURDB" ofType:@"momd"];
        NSURL *momURL = [NSURL fileURLWithPath:path];
        managedObjectModel = [[NSManagedObjectModel alloc] initWithContentsOfURL:momURL];

    as suggested by other posts, with no success. It seems that it can't find ANY of my models... does anyone have any idea?

    Read the article

  • C# average function for large numbers without overflow exception

    - by Ron Klein
    .NET Framework 3.5. I'm trying to calculate the average of some pretty large numbers. For instance:

        using System;
        using System.Linq;

        class Program
        {
            static void Main(string[] args)
            {
                var items = new long[] { long.MaxValue - 100, long.MaxValue - 200, long.MaxValue - 300 };
                try
                {
                    var avg = items.Average();
                    Console.WriteLine(avg);
                }
                catch (OverflowException ex)
                {
                    Console.WriteLine("can't calculate that!");
                }
                Console.ReadLine();
            }
        }

    Obviously, the mathematical result is 9223372036854775607 (long.MaxValue - 200), but I get an exception there. This is because the implementation (on my machine) of the Average extension method, as inspected with .NET Reflector, is:

        public static double Average(this IEnumerable<long> source)
        {
            if (source == null)
            {
                throw Error.ArgumentNull("source");
            }
            long num = 0L;
            long num2 = 0L;
            foreach (long num3 in source)
            {
                num += num3;
                num2 += 1L;
            }
            if (num2 <= 0L)
            {
                throw Error.NoElements();
            }
            return (((double) num) / ((double) num2));
        }

    I know I can use a BigInt library (yes, I know it is included in .NET Framework 4.0, but I'm tied to 3.5). But I still wonder if there's a straightforward way to calculate the average of large integers without an external library. Do you happen to know of such an implementation? Thanks!!
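
    Not part of the question, but a minimal sketch of one well-known workaround: a running (incremental) mean, which never materializes the long sum that overflows. StreamingAverage is a hypothetical helper name, not a framework method.

        using System;
        using System.Collections.Generic;

        static class AverageSketch
        {
            // Running mean: avg_n = avg_(n-1) + (x_n - avg_(n-1)) / n.
            // The arithmetic is done in double, so no long sum is ever built
            // up and no OverflowException can occur. Accuracy is limited to
            // double's 53-bit mantissa, so the last few digits of values near
            // long.MaxValue are approximate.
            public static double StreamingAverage(this IEnumerable<long> source)
            {
                double avg = 0;
                long n = 0;
                foreach (long x in source)
                {
                    n++;
                    avg += (x - avg) / n;
                }
                if (n == 0)
                {
                    throw new InvalidOperationException("Sequence contains no elements");
                }
                return avg;
            }
        }

    On 3.5, accumulating the sum into a decimal is another option when exact digits matter: decimal carries about 28-29 significant digits, so the sum of billions of long values still fits, at some cost in speed.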

    Read the article

  • Structuring projects & dependencies of large winforms applications in C#

    - by Benjol
    UPDATE: This is one of my most-visited questions, and yet I still haven't really found a satisfactory solution for my project. One idea I read in an answer to another question is to create a tool which can build solutions 'on the fly' for projects that you pick from a list. I have yet to try that though. How do you structure a very large application? Multiple smallish projects/assemblies in one big solution? A few big projects? One solution per project? And how do you manage dependencies in the case where you don't have one solution. Note: I'm looking for advice based on experience, not answers you found on Google (I can do that myself). I'm currently working on an application which has upward of 80 dlls, each in its own solution. Managing the dependencies is almost a full time job. There is a custom in-house 'source control' with added functionality for copying dependency dlls all over the place. Seems like a sub-optimum solution to me, but is there a better way? Working on a solution with 80 projects would be pretty rough in practice, I fear. (Context: winforms, not web) EDIT: (If you think this is a different question, leave me a comment) It seems to me that there are interdependencies between: Project/Solution structure for an application Folder/File structure Branch structure for source control (if you use branching) But I have great difficulty separating these out to consider them individually, if that is even possible. I have asked another related question here.

    Read the article

  • Working with a large data object between ruby processes

    - by Gdeglin
    I have a Ruby hash that reaches approximately 10 megabytes when written to a file with Marshal.dump; after gzip compression it is approximately 500 kilobytes. Iterating through and altering this hash is very fast in Ruby (fractions of a millisecond), and even copying it is extremely fast. The problem is that I need to share the data in this hash between Ruby on Rails processes. To do this using the Rails cache (file_store or memcached) I need to Marshal.dump the hash first, but this incurs a 1000 millisecond delay when serializing it and a 400 millisecond delay when deserializing it. Ideally I want to be able to save and load this hash from each process in under 100 milliseconds. One idea is to spawn a new Ruby process to hold this hash and provide an API for the other processes to modify or process the data within it, but I want to avoid that unless I'm certain there is no other way to share this object quickly. Is there a way I can share this hash between processes more directly, without needing to serialize or deserialize it? Here is the code I'm using to generate a hash similar to the one I'm working with:

        @a = []
        0.upto(500) do |r|
          @a[r] = []
          0.upto(10_000) do |c|
            if rand(10) == 0
              @a[r][c] = 1 # 10% chance of being 1
            else
              @a[r][c] = 0
            end
          end
        end

        @c = Marshal.dump(@a) # 1000 milliseconds
        Marshal.load(@c)      # 400 milliseconds

    Update: Since my original question did not receive many responses, I'm assuming there's no solution as easy as I had hoped. Presently I'm considering two options: (1) create a Sinatra application to store this hash, with an API to modify/access it, or (2) create a C application that does the same as #1, but a lot faster. The scope of my problem has grown such that the hash may be larger than my original example, so #2 may be necessary; but I have no idea where to start writing a C application that exposes an appropriate API. A good walkthrough of how best to implement #1 or #2 may receive best-answer credit.

    Read the article
