Search Results

Search found 65558 results on 2623 pages for 'large data'.


  • Are there any specific workflows or design patterns that are commonly used to create large functional programming applications?

    - by Andrew
    I have been exploring Clojure for a while now, although I haven't used it on any nontrivial projects. Basically, I have just been getting comfortable with the syntax and some of the idioms. Coming from an OOP background, with Clojure being the first functional language that I have looked very much into, I'm naturally not as comfortable with the functional way of doing things. That said, are there any specific workflows or design patterns that are common when creating large functional applications? I'd really like to start using functional programming "for real", but I'm afraid that with my current lack of expertise, it would result in an epic fail. The "Gang of Four" is such a standard for OO programmers, but is there anything similar that is more directed at the functional paradigm? Most of the resources that I have found have great programming nuggets, but they don't step back to give a broader, more architectural look.

    Read the article

  • Ruby on Rails: Accessing production database data for testing

    - by williamjones
    With Ruby on Rails, is there a way for me to dump my production database into a form that the test part of Rails can access? I'm thinking either a way to turn the production database into fixtures, or else a way to migrate data from the production database into the test database that will not get routinely cleared out by Rails. I'd like to use this data for a variety of tests, but foremost in my mind is using real data with the performance tests, so that I can get a realistic understanding of load times.

    Read the article

  • jQuery UI Autocomplete plug-in pass in optional data for ajax call

    - by dev.e.loper
    I'm using the jQuery UI Autocomplete plug-in. I'm giving it a URL to make an ajax call and retrieve data. I want to pass optional data parameters, but not as part of the URL. In the code they make a getJSON call and pass in a 'request' parameter (which is an optional data parameter), however I don't see a way to get at this request parameter.

      this.source = function( request, response ) {
          $.getJSON( url, request, response );
      };

    Read the article

  • Under what circumstances are linked lists useful?

    - by Jerry Coffin
    Most times I see people try to use linked lists, it seems to me like a poor (or very poor) choice. Perhaps it would be useful to explore the circumstances under which a linked list is or is not a good choice of data structure. Ideally, answers would expound on the criteria to use in selecting a data structure, and which data structures are likely to work best under specified circumstances.
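
    As a small, hypothetical illustration in C# of one circumstance where a linked list is the better fit: once you already hold a node, unlinking it and re-inserting it elsewhere is O(1), nothing shifts, and no other references are disturbed. That is the property LRU-style caches and free lists exploit.

      using System;
      using System.Collections.Generic;

      class LinkedListDemo
      {
          static void Main()
          {
              // A recency list of page ids: cheap to pull an item out of the middle
              // and push it to the front once you hold its node.
              var pages = new LinkedList<int>(new[] { 1, 2, 3, 4, 5 });
              LinkedListNode<int> hit = pages.Find(3);    // O(n) to locate...

              pages.Remove(hit);                          // ...but O(1) to unlink
              pages.AddFirst(hit);                        // and O(1) to re-insert

              Console.WriteLine(string.Join(" ", pages)); // 3 1 2 4 5
              // A List<int> shifts elements on every such move; the array-backed
              // structure wins instead when traversal dominates, thanks to cache locality.
          }
      }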

    Read the article

  • Import the data to an xls file and open it without Microsoft Excel

    - by user3669577
    I need to build an application that fetches values from a SQL database after executing a query. I must export the data to an xls file and open it without Microsoft Excel. I'm a beginner and have too many problems. Can anyone help me? This is my code at the moment:

      Option Infer On
      Imports System.Linq
      Imports System.Data.SqlClient
      Imports System
      Imports System.IO
      Imports System.Drawing
      Imports System.Drawing.Printing
      Imports System.Windows.Forms
      Imports ExcelLibrary.SpreadSheet

      Public Class frmLottiCaricati
          Dim CnSql As SqlConnection

          Private Sub frmLottiCaricati_Load(ByVal sender As Object, ByVal e As System.EventArgs) Handles Me.Load
              Me.MdiParent = Inizio
              'TB_MinusValenza.Text = VariazionePrezzi.MinusValenza
              'TB_Periodo.Text = VariazionePrezzi.Periodo
              'DG_Prodotti.AutoGenerateColumns = False
              Try
                  Dim StringaSql = "Integrated Security=SSPI;Persist Security Info=False;Initial Catalog=" + Inizio.DatabaseSql + ";Data Source=" + Inizio.ServerSql + ";User ID=" + Inizio.UtenteSql + ";Password=" + Inizio.PwdSql
                  CnSql = New SqlConnection(StringaSql)
                  CnSql.Open()
                  Dim command As SqlCommand
                  Dim dadapter As New SqlDataAdapter
                  Dim DS_Prodotti As New Data.DataSet
                  Dim qry_Prodotti = "SELECT sistemaf.prodscadenze.Ministeriale, sistemaf.prodscadenze.Lotto, sistemaf.prodscadenze.Scadenza " & _
                      "FROM sistemaf.Prodscadenze " 'INNER JOIN sistemaf.Prodscadenze ON sistemaf.prodbase.Cod39 = sistemaf.prodscadenze.Ministeriale ;"
                  command = New SqlCommand(qry_Prodotti, CnSql)
                  dadapter.SelectCommand = command
                  dadapter.Fill(DS_Prodotti)
                  DG_Prodotti.DataSource = DS_Prodotti.Tables(0)
                  'DG_Prodotti.Columns("Descrizione").Width = 220
                  'DG_Prodotti.Columns("Ministeriale").Width = 60
                  DG_Prodotti.Columns("Lotto").Width = 60
                  'DG_Prodotti.Columns("Descrizione").AutoSizeMode = DataGridViewAutoSizeColumnMode.AllCells
                  'DG_Prodotti.Columns("Totale").DefaultCellStyle.Alignment = DataGridViewContentAlignment.MiddleRight
              Catch ex As Exception
                  MessageBox.Show(ex.Message)
              End Try
          End Sub
      End Class

    Right now I can only open the data with Microsoft Excel. Have any suggestions?

    Read the article

  • SSIS Data Flow Task Excel Source

    - by Gerard
    Hi, I have a data flow task set up in SSIS. The source is an Excel source, not a SQL DB. The problem I seem to get is that the package is importing empty rows. My data has 555,200 rows, but when importing, the SSIS package imports over 900,000 rows. The extra rows are imported even though they are empty. When I then download this table into Excel there are empty rows in between the data. Is there any way I can avoid this? Thanks Gerard

    Read the article

  • Bind any version of MySql.Data using the app.config

    - by Martin Kirsche
    How do I bind any version or a range of versions of an assembly by using the app.config? I'm currently binding the MySql.Data assembly like this:

      <runtime>
        <assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1" applies-to="v2.0.50727">
          <qualifyAssembly partialName="MySql.Data"
                           fullName="MySql.Data, Version=6.2.2.0, Culture=neutral, PublicKeyToken=c5687fc88969c44d, processorArchitecture=MSIL"/>
        </assemblyBinding>
      </runtime>

    Any version of MySql.Data other than 6.2.2.0 is not working this way. The versions of this assembly are changing fast, so I want to bind either any version, or all versions beginning with 6.2, to my application without changing the app.config each time.

    Read the article

  • Limit the model data fields serialized by Web API based on the return type Interface

    - by Stevo3000
    We're updating our architecture to use a single object model for desktop, web and mobile that can be used in the MVVM pattern. I would like to be able to limit the data fields that are serialized through Web API by using interfaces on the controllers. This is required because the model objects for mobile are stored in HTML5 local storage, so they don't carry optional data, while a thin desktop client would be able to store (and work with) more data. To achieve this, a model will implement the different interfaces that define which data fields should be serialized, and there will be a controller specific to each interface. The problem is that the Web API always serializes every field in the model, even if it is not part of the interface being returned. How can we only serialize fields in the returned interface?
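
    For what it's worth, the usual workaround is projection: the serializer writes the concrete runtime type, so declaring an action's return type as an interface does not trim the payload, but copying into a DTO that carries only the interface's fields does. A sketch with invented type names:

      using System.Collections.Generic;
      using System.Linq;
      using System.Web.Http;

      // The fields the mobile client actually needs.
      public interface ISlimMember
      {
          int Id { get; }
          string Name { get; }
      }

      // Full model shared with the desktop client.
      public class Member : ISlimMember
      {
          public int Id { get; set; }
          public string Name { get; set; }
          public byte[] Photo { get; set; }   // optional data mobile should not receive
          public string Notes { get; set; }
      }

      // Concrete type that carries only the interface's fields.
      public class SlimMemberDto : ISlimMember
      {
          public int Id { get; set; }
          public string Name { get; set; }
      }

      public class MobileMembersController : ApiController
      {
          private static readonly List<Member> store = new List<Member>();  // stand-in for the real data source

          // Declaring IEnumerable<ISlimMember> here would still serialize Photo and Notes,
          // because the formatter walks the runtime type (Member). Copy into the slim DTO instead.
          public IEnumerable<SlimMemberDto> Get()
          {
              return store.Select(m => new SlimMemberDto { Id = m.Id, Name = m.Name }).ToList();
          }
      }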

    Read the article

  • Recommendations for an in memory database vs thread safe data structures

    - by yx
    TLDR: What are the pros/cons of using an in-memory database vs locks and concurrent data structures? I am currently working on an application that has many (possibly remote) displays that collect live data from multiple data sources and render them on screen in real time. One of the other developers has suggested the use of an in-memory database instead of doing it the standard way our other systems behave, which is to use concurrent hashmaps, queues, arrays, and other objects to store the graphical objects, handling them safely with locks if necessary. His argument is that the DB will lessen the need to worry about concurrency since it will handle read/write locks automatically, and also that the DB will offer an easier way to structure the data into as many tables as we need, instead of having to create hashmaps of hashmaps of lists, etc. and keeping track of it all. I do not have much DB experience myself, so I am asking fellow SO users what experiences they have had and what the pros & cons are of introducing the DB into the system?
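
    For comparison, the "standard way" described above is mostly a couple of framework types doing the locking for you; a minimal sketch in C# (names invented here, and equivalent types exist on other runtimes):

      using System;
      using System.Collections.Concurrent;

      // One "table" of live values keyed by data-source id.
      class LiveDataCache
      {
          // ConcurrentDictionary handles the read/write locking internally.
          private readonly ConcurrentDictionary<string, double> latest =
              new ConcurrentDictionary<string, double>();

          // Hand-off queue from the network threads to the rendering thread.
          private readonly BlockingCollection<(string Source, double Value)> inbox =
              new BlockingCollection<(string Source, double Value)>();

          public void Publish(string source, double value) => inbox.Add((source, value));

          public void DrainToScreen(Action<string, double> render)
          {
              foreach (var (source, value) in inbox.GetConsumingEnumerable())
              {
                  latest[source] = value;   // atomic upsert, no explicit lock
                  render(source, value);
              }
          }

          public bool TryRead(string source, out double value) => latest.TryGetValue(source, out value);
      }

    Roughly, what an in-memory database adds on top of this is ad-hoc querying and joins across "tables"; what it costs is an extra dependency, a mapping layer, and another place for latency to hide.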

    Read the article

  • Change XML data to an ArrayCollection

    - by krishna
    I have an xml file with data as below and I want to convert this into a Flex ArrayCollection, including the id and name of the tags. I am using HTTPService to get the file.

    data.xml:

      <data>
        <result month="Jan" value="0.666">
          <info id="jan01Display" name="jhon" age="20" />
          <info id="jan02Display" name="adams" age="24" />
          <info id="jan03Display" name="prasad" age="27" />
        </result>
      </data>

    Read the article

  • updating batches of data

    - by gaponte69
    I am using a GridView in ASP.NET and editing data with the edit command field property (as we know, after updating the edited row, the database is automatically updated). I want to use transactions (with a begin-to-commit statement, including rollback) to commit this update query in the database after clicking some button (after some events, for example), not to automatically insert or update the edited data from the grid directly into the DB. So I want to save the edits somewhere temporary (even many edited rows, not just one row) and then confirm the transaction, updating the real tables in the database... Any suggestions are welcomed... I've used some good links, very helpful, like: http://www.asp.net/learn/data-access/tutorial-63-cs.aspx http://www.asp.net/learn/data-access/tutorial-66-cs.aspx etc...
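
    A rough sketch of the "commit on button click" idea, assuming the edited rows have been buffered somewhere (Session, a DataTable, etc.) instead of being written during the GridView's update event; table, column, and type names are invented:

      using System.Collections.Generic;
      using System.Data.SqlClient;

      public static class GridSaver
      {
          // Writes all buffered edits inside one transaction: every row lands, or none do.
          public static void SaveAll(string connectionString, IEnumerable<(int Id, string Name)> pendingEdits)
          {
              using (var conn = new SqlConnection(connectionString))
              {
                  conn.Open();
                  using (SqlTransaction tx = conn.BeginTransaction())
                  {
                      try
                      {
                          foreach (var row in pendingEdits)
                          {
                              using (var cmd = new SqlCommand(
                                  "UPDATE Products SET Name = @name WHERE Id = @id", conn, tx))
                              {
                                  cmd.Parameters.AddWithValue("@name", row.Name);
                                  cmd.Parameters.AddWithValue("@id", row.Id);
                                  cmd.ExecuteNonQuery();
                              }
                          }
                          tx.Commit();     // confirm the whole batch...
                      }
                      catch
                      {
                          tx.Rollback();   // ...or undo all of it
                          throw;
                      }
                  }
              }
          }
      }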

    Read the article

  • How can I visualise a "broken" hierarchical dataset?

    I have a reasonably large data table structured something like this:

      StaffNo  Grade  Direct  Boss2  Boss3  Boss4  Boss5  Boss6
      -------  -----  ------  -----  -----  -----  -----  -----
      10001    1      10002   10002  10057  10094  10043  10099
      10002    2      10057   NULL   10057  10094  10043  10099
      10003    1      10004   10004  10057  10094  10043  10099
      10004    2      10057   NULL   10057  10094  10043  10099
      10057    3      10094   NULL   NULL   10094  10043  10099
      etc....

    i.e. a unique id, their level (grade) in the hierarchy, a record of their boss's ID and the IDs of the supervisors above. (The 2, 3, 4, etc. refer to the boss at that particular grade.) The system relies on a strict hierarchy: if you are my boss (/parent) then your boss must be my grandparent. Unfortunately this rule is not enforced within the data model, and the data ultimately comes from other systems which don't even know about the rule, let alone observe it. So you and I may share the same boss, but our boss's boss won't be the same. Note: I cannot change the data model, and I cannot fix the data at source. So (for the moment) I have to fix the data once it's in place. Once a fortnight someone will do something which breaks the model and I'll need to modify the procs slightly to resolve it. Not ideal, but I'm stuck with this for the next six months. Anyway, specific queries are easy to produce but I find it hard to keep track of the bigger picture. The application which sits on this runs without complaint regardless, but navigating around the system is becoming extraordinarily confusing. So my question is: can anyone recommend a tool (or technique) for generating some kind of "broken tree" diagram in this sort of circumstance? I don't want something that will fix things for me, or attempt statistical analysis, but at least something that will give a visual indication of how broken it is at any one time. Note: at the moment this is in a SQL Server database but I'm open to ideas utilising C#, Perl or Python.
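
    Not a polished tool, but one cheap option is to dump the rows to a Graphviz DOT file and colour the edges that break the boss-of-my-boss rule, then render it with dot. A C# sketch along those lines, with a deliberately simplified Staff shape (the real table has one boss column per grade):

      using System;
      using System.Collections.Generic;
      using System.Linq;

      class Staff
      {
          public int Id;
          public int? DirectBoss;   // the "Direct" column
          public int? GrandBoss;    // the recorded boss two levels up
      }

      static class BrokenTree
      {
          // Solid edges for direct reporting lines; red dashed edges where the recorded
          // grandparent disagrees with the direct boss's own recorded boss.
          static string ToDot(IList<Staff> staff)
          {
              var byId = staff.ToDictionary(s => s.Id);
              var lines = new List<string> { "digraph hierarchy {" };
              foreach (var s in staff.Where(x => x.DirectBoss.HasValue))
              {
                  lines.Add($"  {s.Id} -> {s.DirectBoss};");
                  if (s.GrandBoss.HasValue
                      && byId.TryGetValue(s.DirectBoss.Value, out var boss)
                      && boss.DirectBoss != s.GrandBoss)
                  {
                      lines.Add($"  {s.Id} -> {s.GrandBoss} [color=red, style=dashed];");
                  }
              }
              lines.Add("}");
              return string.Join(Environment.NewLine, lines);
          }

          static void Main()
          {
              var sample = new List<Staff>
              {
                  new Staff { Id = 10001, DirectBoss = 10002, GrandBoss = 10057 },
                  new Staff { Id = 10002, DirectBoss = 10057, GrandBoss = 10094 },
                  new Staff { Id = 10057, DirectBoss = 10094, GrandBoss = 10043 },
              };
              Console.WriteLine(ToDot(sample));   // pipe into: dot -Tpng broken.dot -o broken.png
          }
      }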

    Read the article

  • Why won't Ubuntu copy large files to FAT32 flash Drives?

    - by yurividal
    Since I installed 11.10 I am unable to copy large files (say 1 GB or more) to ANY USB drive that is formatted as FAT. The file starts copying, but soon an error appears, saying "Unable to Copy": "Error splicing file: Input/output error". I am able to do it via the terminal, using the cp command. I use GNOME 3, but the same error has happened in Unity as well. Apparently it works if I format the USB drive as NTFS, EXT3 or EXT4, but for many appliances FAT is necessary. The problem is also not with the USB port, because it works under Windows. It did not happen before, when I had 10.04 installed.

    Read the article

  • Push or Pull to Excel for reporting data

    - by Nathan Fisher
    I am unsure which is the best way to go here. I have a third-party Excel 2003 spreadsheet that needs to be filled in on a monthly basis and emailed. Currently it is a manual process and I am in the process of automating the generation of the spreadsheet. I have been throwing around different ideas of how to get the data into the spreadsheet. I have thought of using SSRS to create a report that is in a similar format and getting the user to cut and paste. Alternatively, writing a VBA add-in that retrieves the data from a web service and then adds the data to the spreadsheet. Or using the third-party spreadsheet as a template, opening it on the server via OLE DB, adding the data and then serving it as a downloadable file. Which is better, or are there better solutions out there?
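
    If the "fill the template on the server and serve it as a download" route is chosen, one way to write into an .xls file without Excel installed is the Jet OLE DB provider. A rough sketch, assuming the third-party workbook has a tab named Sheet1 with Month and Amount columns (all of those names are invented, and a 64-bit process would need the ACE provider instead):

      using System.Data.OleDb;
      using System.IO;

      static class ExcelTemplateFiller
      {
          public static void Fill(string templatePath, string outputPath, string month, decimal amount)
          {
              File.Copy(templatePath, outputPath, overwrite: true);   // never touch the template itself

              var connStr = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=" + outputPath +
                            ";Extended Properties=\"Excel 8.0;HDR=Yes\"";
              using (var conn = new OleDbConnection(connStr))
              {
                  conn.Open();
                  using (var cmd = new OleDbCommand(
                      "INSERT INTO [Sheet1$] ([Month], [Amount]) VALUES (?, ?)", conn))
                  {
                      // OLE DB parameters bind by position, not by name.
                      cmd.Parameters.AddWithValue("@month", month);
                      cmd.Parameters.AddWithValue("@amount", amount);
                      cmd.ExecuteNonQuery();
                  }
              }
              // outputPath is now ready to be emailed or offered as a download.
          }
      }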

    Read the article

  • Which Large File System Format to use for USB Flash drive compatible with Ubuntu/Mac/Windows?

    - by wajiw
    I've had this problem for a long time and can't find a solution. I switch between the 3 OSes all the time and use a 1TB USB drive to do so. I can't seem to find a format that is compatible across all systems and that handles large files (at least 8-9 GB). Does anyone have a solution for this? Recently I've tried exFAT, but that messes up the filesystem when trying to read on Windows after adding files from Ubuntu (using the FUSE driver). The OSes I'm currently using are Windows Vista/7, Mac OS X (10.6.5) and Ubuntu 10.10.

    Read the article

  • Updating Data through Objects

    - by Chacha102
    So, let's say I have a record:

      $record = new Record();

    and let's say I assign some data to that record:

      $record->setName("SomeBobJoePerson");

    How do I get that into the database? Do I...

    A) Have the module do it:

      class Record {
          public function __construct(DatabaseConnection $database) {
              $this->database = $database;
          }
          public function setName($name) {
              $this->database->query("query stuff here");
              $this->name = $name;
          }
      }

    B) Run through the modules at the end of the script:

      class Record {
          private $changed = false;
          public function __construct(array $data = array()) {
              $this->data = $data;
          }
          public function setName($name) {
              $this->data['name'] = $name;
              $this->changed = true;
          }
          public function isChanged() {
              return $this->changed;
          }
          public function toArray() {
              return $this->data;
          }
      }

      class Updater {
          public function update(array $records) {
              foreach ($records as $record) {
                  if ($record->isChanged()) {
                      $this->updateRecord($record->toArray());
                  }
              }
          }
          public function updateRecord() {
              // updates stuff
          }
      }

    Read the article

  • Export data as Excel file from ASP.NET

    - by Yongwei Xing
    Hi all, I have data like below:

      AAAAAA
      BBBBBB
      CCCCCC
      DDDDDD
      EEEEEE

    Now there is a button on the page; when I click the button, the browser should download an Excel file with the data above and stay on the current page. Is there any simple way to do it? The data is very simple: only one column, and not huge. Best Regards,
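
    A common shortcut for data this small, sketched with invented control and field names: write the values straight into the response with an Excel content type from the button's click handler. The browser downloads the file and the page itself stays where it is (recent Excel versions may warn that the content is not a native workbook):

      using System;
      using System.Web.UI;

      public partial class ReportPage : Page
      {
          protected void BtnExport_Click(object sender, EventArgs e)
          {
              string[] values = { "AAAAAA", "BBBBBB", "CCCCCC", "DDDDDD", "EEEEEE" };

              Response.Clear();
              Response.ContentType = "application/vnd.ms-excel";
              Response.AddHeader("Content-Disposition", "attachment; filename=data.xls");
              foreach (string v in values)
                  Response.Write(v + "\r\n");   // one value per row, single column
              Response.End();                   // stop the postback; the rendered page is untouched
          }
      }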

    Read the article

  • What is a good approach to preloading data?

    - by Bob Horn
    Are there best practices out there for loading data into a database, to be used with a new installation of an application? For example, for application foo to run, it needs some basic data before it can even be started. I've used a couple of options in the past.

    TSQL for every row that needs to be preloaded:

      IF NOT EXISTS (SELECT * FROM Master.Site WHERE Name = @SiteName)
          INSERT INTO [Master].[Site] ([EnterpriseID], [Name], [LastModifiedTime], [LastModifiedUser])
          VALUES (@EnterpriseId, @SiteName, GETDATE(), @LastModifiedUser)

    Another option is a spreadsheet. Each tab represents a table, and data is entered into the spreadsheet as we realize we need it. Then, a program can read this spreadsheet and populate the DB. There are complicating factors, including the relationships between tables, so it's not as simple as loading tables by themselves. For example, if we create Security.Member rows, then we want to add those members to Security.Role, and we need a way of maintaining that relationship. Another factor is that not all databases will be missing this data. Some locations will already have most of the data, and others (that may be new locations around the world) will start from scratch. Any ideas are appreciated.

    Read the article

  • What is the recommended way to pass data back and forth between two threads using C#

    - by kenalex
    I am trying to make an app that will pass data between two servers, Connection1 and Connection2, using sockets. What I would like to do is receive data from Connection1 and pass it to Connection2 and vice-versa. Connection1 and Connection2 are on different threads. What is the best way to call methods on different threads in order to pass data back and forth between them? Both threads will use the same message object type to communicate in both directions between them. Thanks
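
    One idiomatic pattern, sketched below with invented names: give each direction its own producer/consumer queue so that neither thread ever calls a method on the other; each thread adds to the queue going the other way and drains its own.

      using System;
      using System.Collections.Concurrent;
      using System.Threading;
      using System.Threading.Tasks;

      class Message
      {
          public string Payload { get; set; }
      }

      class Relay
      {
          // One queue per direction; Add and GetConsumingEnumerable are thread-safe.
          private readonly BlockingCollection<Message> toConnection2 = new BlockingCollection<Message>();
          private readonly BlockingCollection<Message> toConnection1 = new BlockingCollection<Message>();

          public void Start()
          {
              Task.Run(() =>                    // Connection1's thread
              {
                  toConnection2.Add(new Message { Payload = "received on 1" });
                  foreach (var msg in toConnection1.GetConsumingEnumerable())
                      Console.WriteLine("1 forwards: " + msg.Payload);
              });

              Task.Run(() =>                    // Connection2's thread, the mirror image
              {
                  toConnection1.Add(new Message { Payload = "received on 2" });
                  foreach (var msg in toConnection2.GetConsumingEnumerable())
                      Console.WriteLine("2 forwards: " + msg.Payload);
              });
          }

          static void Main()
          {
              new Relay().Start();
              Thread.Sleep(500);                // let the demo messages flow
          }
      }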

    Read the article

  • WPF Trigger when Property and Data value are true

    - by KrisTrip
    I need to be able to change the style of a control when a property and data value are true. For example, my bound data has an IsDirty property. I would like to change the background color of my control when IsDirty is true AND the control is selected. I found the MultiTrigger and MultiDataTrigger classes...but in this case I need to somehow trigger on data and property. How can I do this?

    Read the article

  • cross-domain data with AJAX using JSONP

    - by kooshka
    I'm trying to get data from Geobytes. One of the templates returns JSON and I need to cross-domain access it. I wrote these 2 functions:

      function getCountry(ip) {
          var surl = "http://www.geobytes.com/IpLocator.htm?GetLocation&template=json.txt";
          $.ajax({
              url: surl,
              data: '{"ipaddress":"' + ip + '"}',
              dataType: "jsonp",
              jsonp: "callback",
              jsonpCallback: "jsonpcallback"
          });
      }

      function jsonpcallback(rtndata) {
          alert(rtndata.message);
      }

    The call is executed but I get 1 warning:

      Resource interpreted as Script but transferred with MIME type text/html: "http://www.geobytes.com/IpLocator.htm?GetLocation&template=json.txt&callback=jsonpcallback&{%22ipaddress%22:%22200.167.254.166%22}&_=1353148931121"

    and 1 error:

      Uncaught SyntaxError: Unexpected token :

    The error is thrown on the returned data at {"geobytes":{"countryid":117, and I think it is maybe because it's 117 and not "117", but I obviously can't control the returned data. How can I fix these 2 issues?

    Read the article

  • jQuery Sparklines: $.getJSON data can't be read

    - by Bob Jansen
    I'm trying to generate a pie graph with Sparklines but I'm running into some trouble. I can't seem to figure out what I'm doing wrong, but I feel it is a silly mistake. I'm using the following code to generate a sparkline chart in the div #traffic_bos_ss:

      //Display Visitor Screen Size Stats
      $.getJSON('models/ucp/traffic/traffic_display_bos.php', {
          type: 'ss',
          server: server,
          api: api,
          ip: ip,
      }, function(data) {
          var values = data.views;
          //alert(values);
          $('#traffic_bos_ss').sparkline(values, {
              type: "pie",
              height: "100%",
              tooltipFormat: 'data.screen - {{value}}',
          });
      });

    The JSON string fetched:

      {"screen":"1220x1080, 1620x1080, 1920x1080","views":"[2, 2, 61]"}

    For some reason Sparklines does not process the variable values. When I alert the variable it outputs "[2, 2, 61]". Now the jQuery code does work when I replace the snippet:

      var values = data.views;

    with

      var values = [2, 2, 61];

    What am I doing wrong?

    Read the article

  • Delete Range of Data From Text File With PHP

    - by Evan Byrne
    I want to delete a range of data from a text file using PHP. Let's assume the file contains the following:

      Hello, World!

    I want to delete everything from character 2 to character 7. The actual file I need to do this with is very large, so I don't want to have to read the whole file in order to delete just a small, given range of data. The data contained within the given range is not known, so str_replace or preg_replace solutions wouldn't work anyway. Thanks!
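
    The question is about PHP, but the underlying technique is language-agnostic: bytes in the middle of a file cannot simply be unlinked, so the usual approach is to stream the tail forward over the deleted range and then truncate, which touches only the data after the range rather than the whole file. A C# sketch of that idea:

      using System;
      using System.IO;

      static class FileRange
      {
          // Removes [start, start + length) by copying the tail forward in chunks.
          public static void DeleteRange(string path, long start, long length)
          {
              using (var fs = new FileStream(path, FileMode.Open, FileAccess.ReadWrite))
              {
                  long readPos = start + length;
                  long writePos = start;
                  var buffer = new byte[64 * 1024];
                  while (true)
                  {
                      fs.Position = readPos;
                      int read = fs.Read(buffer, 0, buffer.Length);
                      if (read == 0) break;
                      readPos += read;

                      fs.Position = writePos;
                      fs.Write(buffer, 0, read);
                      writePos += read;
                  }
                  fs.SetLength(writePos);   // chop off the now-duplicated tail
              }
          }

          static void Main()
          {
              File.WriteAllText("demo.txt", "Hello, World!");
              DeleteRange("demo.txt", 1, 6);                    // characters 2 to 7 (offset 1, length 6)
              Console.WriteLine(File.ReadAllText("demo.txt"));  // HWorld!
          }
      }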

    Read the article

  • Providing an application data update from a website

    - by Craig Johnston
    I need to provide an update to application data as a download from a website. The update would actually just be the replacing of some data files with some updated ones. The update, which I assume would be some sort of setup-package-type program, would need to be able to do the following:

    • access the file system and registry to determine where files should be copied to
    • supply the files to be copied
    • provide strong security so the data files cannot be downloaded or used by the wrong people

    What would be the best way to achieve all of the above?

    Read the article
