Search Results

Search found 6329 results on 254 pages for 'linq to csv'.

Page 196/254 | < Previous Page | 192 193 194 195 196 197 198 199 200 201 202 203  | Next Page >

  • Sqlite3 Database versus populating Arrays

    - by Kenoy
    Hi, I am working on a program that requires me to input values for 12 objects, each with 4 arrays of 100 values (4,800 values in total). The 4 arrays represent possible outcomes based on 2 boolean values, i.e. YY, YN, NN, NY, and the 100 values in each array are what I want to extract based on another input variable. I previously had all possible outcomes in a CSV file and imported them into SQLite, where I can query them for the value using SQL. However, it has been suggested to me that an SQLite database is not the way to go, and that I should instead populate hardcoded arrays. Which would be better for runtime performance and memory management?

    Read the article

  • How to debug/reformat C printf calls with lots of arguments in vim?

    - by Costi
    A program I'm maintaining has a printf call with 28 arguments; it prints a lot of data into a CSV file. I have trouble following which argument goes where, and I have some mismatches in the parameter types. I enabled -Wall in gcc and I get warnings like:

      n.c:495: warning: int format, pointer arg (arg 15)
      n.c:495: warning: format argument is not a pointer (arg 16)
      n.c:495: warning: double format, pointer arg (arg 23)

    The call looks like this:

      fprintf (ConvFilePtr, "\"FORMAT3\"%s%04d%s%04d%s%s%s%d%s%c%s%d%c%s%s%s%s%s%s%s%11.lf%s%11.lf%s%11.lf%s%d\n", some_28_arguments_go_here);

    Is there a vim plugin that highlights the matching printf format specifier when the cursor is over a variable? Other solutions? How can I reformat the code to make it more readable?
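
    One low-tech way to make such a call readable (a sketch, not from the original question; the variable names below are invented) is to split the format string into adjacent literals, which the compiler concatenates, so each specifier sits next to a comment naming its argument:

      /* Minimal sketch: adjacent string literals are concatenated at compile
         time, so the format can be split with one specifier per line. */
      #include <stdio.h>

      int main(void)
      {
          const char *site = "ACME";
          int         id   = 42;
          double      load = 3.14;

          printf("\"FORMAT3\""
                 "%s"     /* arg 1: site name   */
                 "%04d"   /* arg 2: site id     */
                 "%11.2f" /* arg 3: load factor */
                 "\n",
                 site, id, load);
          return 0;
      }

    Counting arguments against specifiers then becomes mechanical, and the gcc warnings ("arg 15", "arg 16", "arg 23") point straight at the offending lines.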

    Read the article

  • Creating an Excel Template for different data size

    - by dassouki
    I created an Excel template for a routine work calculation. The file takes data from a data logger, does some analysis on it, and outputs one number regardless of the input size. The problem I'm having is that I have to modify the sheets to suit the number of rows, because the data logger outputs a different number of rows every day. There are about 15 sheets in the workbook, and it's annoying to have to change every one of them every day. What I'd like is to input the data logger CSV and have the result come out automatically. Is there a way to achieve this, through VBA or otherwise?
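
    A minimal VBA sketch of one way to cope with a varying row count (the sheet names, column and formula below are invented): find the last used row when the data arrives and build ranges from it, instead of hard-coding them into each sheet.

      ' Sketch only: assumes the logger CSV has been imported into a sheet named
      ' "Data" with readings in column A starting at row 2.
      Sub UpdateSummary()
          Dim ws As Worksheet, lastRow As Long
          Set ws = Worksheets("Data")
          lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
          Worksheets("Summary").Range("B1").Value = _
              Application.WorksheetFunction.Average(ws.Range("A2:A" & lastRow))
      End Sub

    The same last-row lookup can feed every dependent sheet, so none of the 15 sheets needs manual editing when the row count changes.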

    Read the article

  • Trouble getting email attachment from Exchange

    - by JimR
    I am getting the error message “The remote server returned an error: (501) Not Implemented.” when I call HttpWebRequest.GetResponse() with the GET method to fetch an email attachment from Exchange. I have tried changing the HttpVersion, and I don't think it is a permissions issue, since I can search the inbox. I know my credentials are correct, as they are used to get the HREF via HttpWebRequest.Method = "SEARCH" on the inbox (https://mail.mailserver.com/exchange/testemailaccount/Inbox/).

    HREF = https://mail.mailserver.com/exchange/testemailaccount/Inbox/testemail.EML/attachment.csv

    Sample code:

      HttpWebRequest req = (System.Net.HttpWebRequest)HttpWebRequest.Create(HREF);
      req.Method = "GET";
      req.Credentials = this.mCredentialCache;
      string data = string.Empty;
      using (WebResponse resp = req.GetResponse())
      {
          Encoding enc = Encoding.Default;
          if (resp == null)
          {
              throw new Exception("Response contains no information.");
          }
          using (StreamReader sr = new StreamReader(resp.GetResponseStream(), Encoding.ASCII))
          {
              data = sr.ReadToEnd();
          }
      }

    Read the article

  • Using subset but old variables still left

    - by user2520852
    I am working with a data set, which is basically daily usage data (let's just say variables X and Y) by different cities (about 150 cities). I have created a subset of the data for specific cities, choosing just 3 of the 150. But when I do tapply by city, I get means for the 3 cities and also NA for all of the other 147 cities that were in the original data set. I am using the code below:

      df <- read.csv(...)
      df_sub <- subset(df, df$City == 1 | df$City == 3 | df$City == 19)
      X_Breakdown <- tapply(X, df_sub$City, mean, na.rm = TRUE)
      print(X_Breakdown)

      City 1  City 2  City 3  City 4  City 5  City 6
          15      NA      12      NA      NA      NA

    Hope you get the idea. I would like to get a result that only contains the 3 cities I'm interested in. It seems that the full set of cities is still encoded somewhere in R. Is there a way to fix this? Kindly advise. Thanks
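
    One likely explanation (an assumption, not stated in the original post): City is a factor, and subset() keeps all 150 factor levels even though only 3 of them occur in the subset, so tapply reports NA for every empty level. A minimal sketch of the usual fix:

      # Sketch: drop unused factor levels after subsetting, then aggregate.
      df_sub <- subset(df, City %in% c(1, 3, 19))
      df_sub$City <- droplevels(df_sub$City)   # or: df_sub <- droplevels(df_sub)
      X_Breakdown <- tapply(df_sub$X, df_sub$City, mean, na.rm = TRUE)
      print(X_Breakdown)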

    Read the article

  • Export Multiple Sheets to Excel Through Browser

    - by ProfK
    I need to export multiple data tables to Excel on the client's machine, each to its own sheet. If it were just one sheet, I'd use the Excel/CSV content type, but I've heard of an XML format that can represent an entire workbook. I don't want to go down the Packaging and .xlsx route, so I need standard .xls. Our bug tracker, Gemini, used to have an export function that produced an XML file that Excel automatically opened as a multi-sheet workbook, but I can't find it. Is there still such a mechanism, and where can I find that schema?
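
    The format being described is most likely the Excel 2003 "XML Spreadsheet" (SpreadsheetML) format, which Excel opens as a multi-sheet workbook. A minimal sketch of a two-sheet file (sheet names and cell values are invented):

      <?xml version="1.0"?>
      <Workbook xmlns="urn:schemas-microsoft-com:office:spreadsheet"
                xmlns:ss="urn:schemas-microsoft-com:office:spreadsheet">
        <Worksheet ss:Name="Customers">
          <Table>
            <Row>
              <Cell><Data ss:Type="String">Name</Data></Cell>
              <Cell><Data ss:Type="String">City</Data></Cell>
            </Row>
          </Table>
        </Worksheet>
        <Worksheet ss:Name="Orders">
          <Table>
            <Row>
              <Cell><Data ss:Type="Number">1001</Data></Cell>
            </Row>
          </Table>
        </Worksheet>
      </Workbook>

    Serving it with the application/vnd.ms-excel content type generally works, although newer Excel versions may warn that the file's content does not match its extension.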

    Read the article

  • How can I deploy a Perl/Python/Ruby script without installing an interpreter?

    - by Brian G
    I want to write a piece of software which is essentially a regex data scrubber. I am going to take a contact list in CSV and remove all non-word characters and such from each person's name. This project has Perl written all over it, but my client base is largely non-technical, and installing Perl on Windows would not be worth it for them. Any ideas on how I can use a Perl/Python/Ruby-type language without all the headaches of getting the interpreter onto their computers? I thought about doing it on the web for a second, but that would not work for business reasons.
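
    One common route (a suggestion, not from the original question) is to write the scrubber in Python and bundle it into a single Windows executable with PyInstaller, so end users never see an interpreter. A minimal sketch (the file names, and the assumption that the name sits in the first CSV column, are invented):

      # scrub.py - strip non-word characters from the name column of a contact CSV.
      import csv
      import re
      import sys

      def scrub(in_path, out_path):
          with open(in_path, newline="") as src, open(out_path, "w", newline="") as dst:
              reader = csv.reader(src)
              writer = csv.writer(dst)
              for row in reader:
                  if row:
                      row[0] = re.sub(r"[^\w\s]", "", row[0])
                  writer.writerow(row)

      if __name__ == "__main__":
          scrub(sys.argv[1], sys.argv[2])

    Building the standalone executable happens only on the developer's machine, e.g. with "pyinstaller --onefile scrub.py"; Perl has a comparable packer in PAR::Packer (pp).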

    Read the article

  • Excel Spreadsheet - Best way to perform an Oracle Query on a cell

    - by Jamie
    Hi there, I have an Excel spreadsheet. There is a cell containing a concatenated name and surname (don't ask why), for example cell A2 contains BLOGGSJOE. For this cell, I would like to run the following SQL and output the result to cells A3, A4 and A5:

      SELECT i.id, i.forename, i.surname
      FROM individual i
      WHERE UPPER(REPLACE('" & A2 & "', ' ', '')) = UPPER(REPLACE(i.surname || i.forename, ' ', ''))
      AND NVL(i.ind_efface, 'N') = 'N'

    Any idea how I could perform an Oracle query on each such cell and return the result? I have enabled an Oracle data source connection in Excel, I'm just not sure what to do now. Is this a stupid approach, and can you recommend a better, more proficient way? Thanks muchly! I lack the necessary experience in this type of thing! :-) EDIT: I am aware that I could just write a simple Ruby/PHP/Python/whatever script to loop through the Excel spreadsheet (or CSV file) and then perform the query, but I thought there might be a quick way in Excel itself.

    Read the article

  • Preview result of update/insert query without committing changes to database in MySQL?

    - by Camsoft
    I am writing a script to import CSV files into existing tables within my database. I decided to do the insert/update operations myself using PHP and INSERT/UPDATE statements rather than MySQL's LOAD DATA INFILE command; I have good reasons for this. What I would like to do is emulate the insert/update operations, display the results to the user, give them the option of confirming that everything is OK, and only then commit the changes to the database. I'm using the InnoDB engine, with support for transactions. Not sure if this helps, but I was thinking along the lines of: insert/update, query the data, display it to the user, then either commit or roll back the transaction? Any advice would be appreciated.
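
    A minimal sketch of that idea using PDO (the table, columns and connection details are invented): the inserts run inside a transaction, the preview is read back on the same connection, and nothing becomes permanent until commit() is called.

      <?php
      // Sketch: stage the CSV rows in a transaction, preview them, then commit or roll back.
      $pdo = new PDO('mysql:host=localhost;dbname=app', 'user', 'pass');
      $pdo->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

      $pdo->beginTransaction();
      $stmt = $pdo->prepare('INSERT INTO contacts (name, email) VALUES (?, ?)');
      foreach ($csvRows as $row) {        // $csvRows: parsed CSV, e.g. from fgetcsv()
          $stmt->execute(array($row[0], $row[1]));
      }

      // This SELECT sees the uncommitted rows because it runs on the same connection.
      $preview = $pdo->query('SELECT * FROM contacts ORDER BY id DESC LIMIT 20')->fetchAll();

      if ($userConfirmed) {               // set from the user's response
          $pdo->commit();
      } else {
          $pdo->rollBack();
      }

    One caveat: in a typical web flow the user's confirmation arrives in a later HTTP request, so the transaction cannot simply be held open across requests; the staging often has to happen in the same request as the confirmation, or go through a temporary table.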

    Read the article

  • Doing an Ajax / JSON add to database, with a "wait, operation in progress" icon

    - by Dejan.S
    Hi. There's a part of my page I want to improve. It's a file upload where users can add their contacts from files like Excel, CSV and Outlook. I read the contacts and place them in the database, so what I would like is a regular icon that spins while that operation is running. How could I do that? Ajax? I don't want a progress bar for the file upload, just an indicator for the operation of reading the file. EDIT: I want to know how to make the add-to-database step work with Ajax. Should I use an UpdatePanel? Thanks
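
    If the import runs as an UpdatePanel partial postback, one option is the built-in UpdateProgress control, which shows its template only while the async postback is in flight. A sketch (the control IDs, button handler and spinner image are invented):

      <asp:ScriptManager ID="ScriptManager1" runat="server" />

      <asp:UpdatePanel ID="ImportPanel" runat="server">
          <ContentTemplate>
              <asp:Button ID="ImportButton" runat="server" Text="Import contacts"
                          OnClick="ImportButton_Click" />
          </ContentTemplate>
      </asp:UpdatePanel>

      <asp:UpdateProgress ID="ImportProgress" runat="server"
                          AssociatedUpdatePanelID="ImportPanel">
          <ProgressTemplate>
              <img src="images/spinner.gif" alt="Importing contacts..." />
          </ProgressTemplate>
      </asp:UpdateProgress>

    Note that the FileUpload control itself generally needs a full postback, so a common pattern is to upload the file first and then trigger the database import from a button inside the UpdatePanel.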

    Read the article

  • Error on SQL insert statement

    - by Ashley Stewart
    I exported a recordset from one database into a CSV file, and when I try to import it into another using MySQL Workbench I keep getting this error message:

      Executing SQL script in server
      ERROR: Error 1064: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near ' 'Lord it Over', 'Ben', '1993-03-01', 'TRC', NULL, 1983, '1999-09-01', 'NULL', '' at line 1
      INSERT INTO `TRC`.`horse` (`horse_id`, `registered_name`, `stable_name`, `arrival_date`, `last_known_location`, `is_ex_racer`, `birth_year`, `death_date`, `horse_comments`, `sex`, `referral_date`, `horse_height`, `arrival_weight`, `passport_no`, `microchip_no`, `is_on_waiting_list`) VALUES (, 'Lord it Over', 'Ben', '1993-03-01', 'TRC', NULL, 1983, '1999-09-01', 'NULL', 'NULL', 'NULL', NULL, NULL, 'NULL', 'NULL', 0)
      SQL script execution finished: statements: 29 succeeded, 1 failed
      Fetching back view definitions in final form.
      Nothing to fetch

    Any help would be appreciated, as there appear to be no errors as far as I can see.
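
    The failing statement lists 16 columns but its VALUES clause starts with "(, ...", so the first value (horse_id) is simply missing, which is what the parser trips over. A sketch of one possible fix, assuming horse_id is an auto-increment key; the quoted 'NULL' strings would also store the literal text "NULL" rather than SQL NULL, so they are replaced here too:

      -- Sketch: omit the auto-increment column and use real NULLs.
      INSERT INTO `TRC`.`horse`
        (`registered_name`, `stable_name`, `arrival_date`, `last_known_location`,
         `is_ex_racer`, `birth_year`, `death_date`, `horse_comments`, `sex`,
         `referral_date`, `horse_height`, `arrival_weight`, `passport_no`,
         `microchip_no`, `is_on_waiting_list`)
      VALUES
        ('Lord it Over', 'Ben', '1993-03-01', 'TRC',
         NULL, 1983, '1999-09-01', NULL, NULL,
         NULL, NULL, NULL, NULL,
         NULL, 0);

    If the CSV export itself produced the empty field and the 'NULL' strings, fixing the export (or cleaning the file) avoids patching every generated INSERT by hand.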

    Read the article

  • PowerShell Select-Object from array not working

    - by Andrew
    I am trying to separate values in an array so I can pass them to another function. I am using Select-Object within a for loop to go through each line and separate the timestamp and value fields. However, no matter what I do, the code below only displays the first Select-Object variable for each line; the second Select-Object command doesn't seem to work, as my output is a blank line for each of the 6 rows. Any ideas on how to get both values?

      $ReportData = $SystemStats.get_performance_graph_csv_statistics( (,$Query) )

      ### Allocate a new encoder and turn the byte array into a string
      $ASCII = New-Object -TypeName System.Text.ASCIIEncoding
      $csvdata = $ASCII.GetString($ReportData[0].statistic_data)
      $csv2 = ConvertFrom-Csv $csvdata
      $newarray = $csv2 | Where-Object { $_.utilization -ne "0.0000000000e+00" -and $_.utilization -ne "nan" }

      for ( $n = 0; $n -lt $newarray.Length; $n++ ) {
          $nTime = $newarray[$n]
          $nUtil = $newarray[$n]
          $util = $nUtil | Select-Object Utilization
          $util
          $tstamp = $nTime | Select-Object timestamp
          $tstamp
      }
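
    One possible explanation (an assumption about the data, not something stated in the question): Select-Object with a property name returns a wrapper object rather than the bare value, and it silently yields an empty property if the name doesn't exactly match a column of the converted CSV. A minimal sketch that pulls out the raw values instead:

      # Sketch: -ExpandProperty returns the property value itself, not a wrapper object.
      foreach ($row in $newarray) {
          $tstamp = $row | Select-Object -ExpandProperty timestamp
          $util   = $row | Select-Object -ExpandProperty utilization
          "{0}  {1}" -f $tstamp, $util    # or hand both values to the next function here
      }

    Checking the real column names first with "$csv2 | Get-Member -MemberType NoteProperty" shows exactly which property names ConvertFrom-Csv produced.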

    Read the article

  • Python - Finding unicode/ascii problems

    - by user330739
    Hi all, I am using csv.reader to pull in info from a very long sheet. I am doing work on that data set and then using the xlwt package to give me a workable Excel file. However, I get this error:

      UnicodeDecodeError: 'ascii' codec can't decode byte 0x92 in position 34: ordinal not in range(128)

    My question to you all is: how can I find exactly where that error is in my data set? Also, is there some code I can write that will look through my data set and find out where the issues lie (because some data sets run without the above error and others have problems)?
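
    A small sketch for locating the offending bytes (the file name is a placeholder): scan the raw file and report the line and column of anything outside the ASCII range. Byte 0x92, for instance, is the curly apostrophe in the Windows-1252 encoding, a frequent stowaway in spreadsheet exports.

      # Sketch: report the position of every non-ASCII byte in the raw CSV.
      def find_non_ascii(path):
          with open(path, "rb") as f:
              for lineno, line in enumerate(f, start=1):
                  for col, byte in enumerate(line, start=1):
                      if byte > 127:
                          print("line %d, column %d: byte 0x%02x" % (lineno, col, byte))

      find_non_ascii("data.csv")

    Once the culprit bytes are known, decoding the input with the right codec (for example "cp1252") instead of ASCII usually resolves the error.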

    Read the article

  • How do you make your Java application memory efficient?

    - by Boune
    How do you optimize the heap usage of an application that has a lot (millions) of long-lived objects? (A big cache, loading lots of records from a db.)

      - Use the right data types
      - Avoid using java.lang.String to represent other data types
      - Avoid duplicated objects
      - Use enums if the values are known in advance
      - Use object pools
      - String.intern() (good idea?)
      - Load/keep only the objects you need

    I am looking for general programming or Java-specific answers. No funky compiler switches.

    Edit: optimize the memory representation of a POJO that can appear millions of times in the heap. Use cases:

      - Load a huge CSV file into memory (converted into POJOs)
      - Use Hibernate to retrieve millions of records from a database

    Summary of answers:

      - Use the flyweight pattern
      - Copy on write
      - Instead of loading 10M objects with 3 properties, is it more efficient to have 3 arrays (or another data structure) of size 10M? (It could be a pain to manipulate the data, but if you are really short on memory...)
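
    A small Java sketch of the interning/flyweight idea above (the class and field names are invented): canonicalize repeated field values through a map so that millions of records share a single instance per distinct value instead of carrying their own copies.

      import java.util.Map;
      import java.util.concurrent.ConcurrentHashMap;

      // Sketch: share one String instance per distinct value across millions of records.
      final class ValuePool {
          private static final Map<String, String> POOL = new ConcurrentHashMap<String, String>();

          static String canonical(String value) {
              String existing = POOL.putIfAbsent(value, value);
              return existing != null ? existing : value;
          }
      }

      final class ContactRecord {
          final String name;
          final String city;   // few distinct values, so sharing saves a lot of heap

          ContactRecord(String name, String city) {
              this.name = name;
              this.city = ValuePool.canonical(city);   // store the shared instance
          }
      }

    String.intern() does the same job for strings via the JVM's own pool; an explicit map keeps the lifetime of the pooled values under the application's control and also works for immutable value objects other than String.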

    Read the article

  • Remove objects from different environments

    - by Fred
    I have an R script file that executes a second R script via:

      source("../scripts/second_file.R")

    That second file has the following lines:

      myfiles <- list.files(".", pattern = "*.csv")
      ...
      rm(myfiles)

    When I run the master R file I get:

      > source("../scripts/second_file.R")
      Error in file.remove(myfiles) : object 'myfiles' not found

    and the program aborts. I think this has something to do with environments. I looked at the ?rm help page, but it was less than illuminating. I figure I have to give it a position argument, but I'm not sure which.

    Read the article

  • How do C or .NET programmers store and load strings in their programs?

    - by Ivan Ivkovic
    I've been doing PHP and the like for the last year; I just got into a bit of C and C++. In the book I'm reading, all the strings are actually in the code (I realize this is just for example, but I'm curious). My interest is: is there a common way for programmers to store strings and display them? Does .NET have some predefined way of doing this, like Android does with its strings file? (In PHP, I keep them in CSV files, completely separate from the code.)
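
    In .NET the conventional home for user-visible strings is a resource (.resx) file, which plays much the same role as Android's strings.xml. A minimal sketch (the "Greeting" key is invented, and it assumes a Resources.resx has been added to the project):

      // Strings live in Properties/Resources.resx (key: Greeting, value: "Hello, world").
      // Visual Studio generates a strongly typed wrapper, so the code never embeds the text.
      using System;

      class Program
      {
          static void Main()
          {
              Console.WriteLine(Properties.Resources.Greeting);
          }
      }

    Plain C has no single standard equivalent; message catalogs such as gettext are a common way to keep displayed strings out of the source.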

    Read the article

  • Transferring data to (Windows) Mobile Devices

    - by Ritu
    I created an app for Windows Mobile 6.5 and am fairly happy with it. However, if anyone else needs to use this app, they will have to transfer an initial file (TXT or CSV) to the device. For a developer this isn't a problem, but is this too much to ask of an end user? Granted, they will also want to move (sync) data back to their desktop after the data on the device has been updated. So how do other apps solve this problem? Do I need to provide some kind of syncing software?

    Read the article

  • C#: split a string into runs of characters, numbers and delimited strings and process it

    - by nrkn
    OK, my regex is a bit rusty and I've been struggling with this particular problem... I need to split and process a string containing any number of the following, in any order:

      - chars (lowercase letters only)
      - quote-delimited strings
      - ints

    The strings are pretty weird (I don't have control over them). When there's more than one number in a row in the string, they're separated by a comma. They need to be processed in the same order in which they appeared in the original string. For example, a string might look like:

      abc20a"Hi""OK"100,20b

    With this particular string the resulting call sequence would look a bit like:

      ProcessLetters( new[] { 'a', 'b', 'c' } );
      ProcessInts( 20 );
      ProcessLetters( 'a' );
      ProcessStrings( new[] { "Hi", "OK" } );
      ProcessInts( new[] { 100, 20 } );
      ProcessLetters( 'b' );

    What I could do is treat it a bit like CSV, where you build tokens by processing the characters one at a time, but I think it could be done more easily with a regex?
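
    One possible approach (a sketch, not a drop-in solution; the Process* bodies below just print): let a single regex with named alternatives tokenize the string in order, then dispatch on which group matched.

      using System;
      using System.Linq;
      using System.Text.RegularExpressions;

      class Tokenizer
      {
          // letters | one or more adjacent quoted strings | comma-separated numbers
          static readonly Regex Pattern =
              new Regex("(?<letters>[a-z]+)|(?<str>(\"[^\"]*\")+)|(?<nums>\\d+(,\\d+)*)");

          static void Main()
          {
              const string input = "abc20a\"Hi\"\"OK\"100,20b";
              foreach (Match m in Pattern.Matches(input))
              {
                  if (m.Groups["letters"].Success)
                      ProcessLetters(m.Value.ToCharArray());
                  else if (m.Groups["str"].Success)
                      ProcessStrings(m.Value.Trim('"').Split(new[] { "\"\"" }, StringSplitOptions.None));
                  else
                      ProcessInts(m.Value.Split(',').Select(int.Parse).ToArray());
              }
          }

          static void ProcessLetters(char[] letters) { Console.WriteLine("letters: " + new string(letters)); }
          static void ProcessStrings(string[] strings) { Console.WriteLine("strings: " + string.Join("|", strings)); }
          static void ProcessInts(int[] ints) { Console.WriteLine("ints: " + string.Join(",", ints)); }
      }

    Matches come back left to right, so the calls arrive in the original order; adjacent quoted strings such as "Hi""OK" arrive as one match and are split apart on the "" boundary.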

    Read the article

  • Storing an arbitrary R object onto HDD?

    - by Harokitty
    I understand that we can export data matrices to CSV or XLSX files. What about complex objects like lm? For example, in my work I might have a list of length 1000, each element holding a single lm() object. Each time I start R I have to wait a long time to populate that 1000-element list with a for loop or lapply. I would rather just save the list somewhere on my HDD at the end of a session and open it at the start of the next session.
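
    R can serialize arbitrary objects, so the whole list of fitted models can be written once and reloaded instantly. A minimal sketch (the object and file names are placeholders):

      # Sketch: persist any R object (here, a list of lm fits) between sessions.
      saveRDS(model_list, file = "model_list.rds")   # end of session
      model_list <- readRDS("model_list.rds")        # start of the next session

      # save()/load() also work, but restore under the original variable name:
      save(model_list, file = "model_list.RData")
      load("model_list.RData")

    Note that an lm object keeps a copy of its model frame, so 1000 of them can be large on disk as well as in memory; fitting with model = FALSE trims them down if that data isn't needed later.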

    Read the article

  • How to insert a null value for a numeric field of a DataTable in C#?

    - by Pandiya Chendur
    Consider: my dynamically generated DataTable contains the following fields: Id, Name, Mob1, Mob2. If my DataTable has this, it gets inserted successfully:

      Id  Name  Mob1        Mob2
      1   acp   9994564564  9568848526

    But when it is like this, it fails:

      Id  Name  Mob1        Mob2
      1   acp   9994564564

    saying: "The given value of type String from the data source cannot be converted to type decimal of the specified target column." I generate my DataTable by reading a CSV file:

      CSVReader reader = new CSVReader(CSVFile.PostedFile.InputStream);
      string[] headers = reader.GetCSVLine();
      DataTable dt = new DataTable();
      foreach (string strHeader in headers)
      {
          dt.Columns.Add(strHeader);
      }
      string[] data;
      while ((data = reader.GetCSVLine()) != null)
      {
          dt.Rows.Add(data);
      }

    Any suggestion how to insert a null value for a numeric field during the BulkCopy in C#? EDIT: I tried dt.Columns["Mob2"].AllowDBNull = true; but it doesn't seem to work...
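
    One likely cause (an assumption, since the bulk-copy call isn't shown): the missing Mob2 field comes through as an empty string, and SqlBulkCopy cannot convert "" to a decimal. A sketch that substitutes DBNull.Value for empty fields while filling the DataTable:

      // Sketch: turn empty CSV fields into DBNull so numeric target columns accept them.
      string[] data;
      while ((data = reader.GetCSVLine()) != null)
      {
          object[] values = new object[data.Length];
          for (int i = 0; i < data.Length; i++)
          {
              values[i] = string.IsNullOrEmpty(data[i]) ? (object)DBNull.Value : data[i];
          }
          dt.Rows.Add(values);
      }

    AllowDBNull on the DataColumn only controls what the DataTable accepts; it does not convert an empty string into a database NULL, which is why setting it made no difference.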

    Read the article

  • Matlab time stamps reading

    - by Paul
    Is there an easy way to read all the columns in Matlab? My format is:

      date       time      y1  y2  y3  y4 ...
      4/27/2010  00:3:09   34  45  45  56 ...

    and so on. Currently I am reading these with the code:

      [c,pathc] = uigetfile({'*.txt'},'Select the data','C:\Data');
      file = [pathc c];
      data = dlmread(file, ',', 1, 3);

    so needless to say I am skipping the timestamps. I was wondering if there is an easy way to read the timestamps as well and plot my other columns against the time in hours. My files are 43200 x 30 and some are 86400 x 90. Related question: is the format the same for .csv and .xls files? I mean, except of course for xlsread.

    Read the article

  • R problem with apply + rbind

    - by Carl
    I cannot seem to get the following to work:

      directory <- "./"
      files.15x16 <- c("15x16-70d.out", "15x16-71d.out")
      data.15x16 <- rbind(
          lapply(
              as.array(paste(directory, files.15x16, sep="")),
              FUN=read.csv, sep=" ", header=F))

    What it should be doing is pretty straightforward: I have a directory name, some file names, and actual files of data. I paste the directory and file names together, read the data from the files in, and then rbind them all together into a single chunk of data. Except the result of the lapply has the data in [[]], i.e. accessing it occurs via a[[1]], a[[2]], etc., which rbind doesn't seem to accept. Suggestions?
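
    rbind called on the list itself only ever sees one argument; the usual trick is do.call, which splices the list elements in as separate arguments. A minimal sketch:

      # Sketch: do.call passes each data frame in the list to rbind as its own argument.
      files <- paste(directory, files.15x16, sep = "")
      frames <- lapply(files, read.csv, sep = " ", header = FALSE)
      data.15x16 <- do.call(rbind, frames)

    This assumes every file has the same columns; do.call(rbind, ...) stops with an error if the column counts differ.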

    Read the article

  • Rails fixtures seem to be adding extra unexpected data

    - by Mason Jones
    Hello, all. I've got a dynamic fixture CSV file that's generating predictable data for a table in order for my unit tests to do their thing. It's working as expected and filling the table with the data, but when I check the table after the tests run, I'm seeing a number of additional rows of "blank" data (all zeros, etc). Those aren't being created by the fixture, and the unit tests are read-only, just doing selects, so I can't blame the code. There doesn't seem to be any logging done during the fixtures setup, so I can't see when the "blank" data is being inserted. Anyone ever run across this before, or have any ideas of how to log or otherwise see what the fixture setup is doing in order to trace down the source of the blank data?

    Read the article

  • Implementing a runtime Look Up Table in C#

    - by Yarok
    Hey all, I'm currently working on a robot interface GUI, using C#. The robot has two sensors and two powered wheels. I need to give the user the option to load a look-up table (LUT) at runtime, one for each sensor, that will tell the robot what to do according to the sensor's reading. I think the best way to do it is a .csv file, formatted like so: index, right wheel order, left wheel order. The index is an int between 0 and 1023 and is actually the sensor's reading; the orders for the right and left wheels are integers between -500 and 500. Example (left sensor's table): the line "1,10,20" means: sensor reads 1, left wheel 10 rpm, right wheel 20 rpm. So my question is this: what is the best way to implement it? Using a DataSet? (If so, how?) Using an array? (If so, how do I load it at runtime?) Any help would be much appreciated, Yarok
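
    A minimal C# sketch of the array approach (the file name, the WheelOrders type and the column order assumed here come from the description above, not from real code): since the index is always 0-1023, a fixed-size array is the lookup table.

      using System;
      using System.IO;

      // Sketch: one entry per possible sensor reading (0-1023).
      struct WheelOrders
      {
          public int Right;
          public int Left;
      }

      static class LutLoader
      {
          public static WheelOrders[] Load(string path)
          {
              var lut = new WheelOrders[1024];
              foreach (string line in File.ReadLines(path))
              {
                  string[] parts = line.Split(',');   // index, right order, left order
                  int index = int.Parse(parts[0]);
                  lut[index] = new WheelOrders
                  {
                      Right = int.Parse(parts[1]),
                      Left = int.Parse(parts[2])
                  };
              }
              return lut;
          }
      }

      // Usage (e.g. from an OpenFileDialog handler):
      //   WheelOrders[] leftLut = LutLoader.Load(dialog.FileName);
      //   WheelOrders orders = leftLut[sensorReading];

    Loading into a plain array keeps the per-reading lookup O(1) with no parsing at runtime; a DataSet would work, but it adds overhead without buying anything for a fixed 1024-entry table.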

    Read the article

  • k-means clustering in R on very large, sparse matrix?

    - by movingabout
    Hello, I am trying to do some k-means clustering on a very large matrix. The matrix is approximately 500000 rows x 4000 cols yet very sparse (only a couple of "1" values per row). The whole thing does not fit into memory, so I converted it into a sparse ARFF file. But R obviously can't read the sparse ARFF file format. I also have the data as a plain CSV file. Is there any package available in R for loading such sparse matrices efficiently? I'd then use the regular k-means algorithm from the cluster package to proceed. Many thanks
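
    One possible direction (a suggestion, not from the original post): the Matrix package stores sparse matrices in compressed form, and a CSV of (row, column) index pairs for the nonzero cells can be turned into one without ever materializing the dense matrix. A sketch, assuming the plain CSV can be reduced to one "row,col" pair per nonzero entry:

      # Sketch: build a sparse 500000 x 4000 matrix from a row,col triplet file.
      library(Matrix)

      triplets <- read.csv("nonzeros.csv", header = FALSE, col.names = c("row", "col"))
      m <- sparseMatrix(i = triplets$row, j = triplets$col, x = 1,
                        dims = c(500000, 4000))

    Whether stats::kmeans copes with an object of that class is a separate question; it may coerce the input to a dense matrix, in which case a sparse-aware clustering package (or clustering on a reduced representation) is still needed.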

    Read the article

< Previous Page | 192 193 194 195 196 197 198 199 200 201 202 203  | Next Page >