Search Results

Search found 52589 results on 2104 pages for 'read table'.


  • Read a file with 2048 bytes

    - by Suresh S
    I have a file which has only one line; it is a plain text file with no special encoding. Every 2048-byte block in the line holds 13 records of 151 bytes each (13 * 151 = 1963 bytes of records plus 85 bytes of empty space), and likewise for the next 2048 bytes. What is the best file I/O to use? I am thinking of reading 2048 bytes from the file at a time and storing them in an array:

        while (offset < fileLength && (numRead = in.read(recordChunks, offset, alength)) >= 0) {
        }

    How can I get exactly 2048 bytes at a time from the read call? I am getting an IndexOutOfBoundsException.
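    The IndexOutOfBoundsException usually comes from passing the growing file offset as the array offset while the array itself stays 2048 bytes long. A minimal sketch of one way to read exact 2048-byte chunks with DataInputStream.readFully; the file name and record handling are placeholders:

        import java.io.DataInputStream;
        import java.io.File;
        import java.io.FileInputStream;
        import java.io.IOException;

        public class ChunkReader {
            private static final int CHUNK_SIZE = 2048;
            private static final int RECORD_SIZE = 151;
            private static final int RECORDS_PER_CHUNK = 13;

            public static void main(String[] args) throws IOException {
                File file = new File("records.dat");              // placeholder path
                long remaining = file.length();
                try (DataInputStream in = new DataInputStream(new FileInputStream(file))) {
                    byte[] chunk = new byte[CHUNK_SIZE];
                    while (remaining >= CHUNK_SIZE) {             // any trailing partial chunk is ignored here
                        in.readFully(chunk);                      // always fills all 2048 bytes
                        for (int r = 0; r < RECORDS_PER_CHUNK; r++) {
                            // records occupy the first 13 * 151 = 1963 bytes; the rest is padding
                            String record = new String(chunk, r * RECORD_SIZE, RECORD_SIZE);
                            System.out.println(record);           // process the record here
                        }
                        remaining -= CHUNK_SIZE;
                    }
                }
            }
        }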

    Read the article

  • Create unique identifier for different row-groups

    - by Max van der Heijden
    I want to number certain combinations of rows in a data frame (which is ordered on id and on time):

        tc <- textConnection('
        id time end_yn number
        abc 10 0 1
        abc 11 0 2
        abc 12 1 3
        abc 13 0 1
        def 10 0 1
        def 15 1 2
        def 16 0 1
        def 17 0 2
        def 18 1 3
        ')
        test <- read.table(tc, header=TRUE)

    The goal is to create a new column ("journey_nr") that gives a unique number to each row based on the journey it belongs to. Journeys are defined as a sequence of rows per id up to and including the row where end_yn == 1; if end_yn never becomes 1, the journey should still be numbered (see the expected outcome below). It is only possible to have end_yn == 0 journeys at the end of the collection of rows for an id (as shown at row 4 for id abc): either no end_yn == 1 has occurred yet for that id, or it happened before the end_yn == 0 journey. I know how to number using the data.table package, but I do not know which columns to combine in order to get the expected outcome. I've searched the data.table tag on SO but could not find a similar problem. Expected outcome:

        id  time end_yn number journey
        abc 10   0      1      1
        abc 11   0      2      1
        abc 12   1      3      1
        abc 13   0      1      2
        def 10   0      1      3
        def 15   1      2      3
        def 16   0      1      4
        def 17   0      2      4
        def 18   1      3      4
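    A minimal data.table sketch of one way to get there, assuming the rows are ordered by id and time as stated: count a journey per id that increments on the row after each end_yn == 1, then use rleid() to make the numbering unique across ids (journey_within is a temporary helper column):

        library(data.table)

        tc <- textConnection('
        id time end_yn number
        abc 10 0 1
        abc 11 0 2
        abc 12 1 3
        abc 13 0 1
        def 10 0 1
        def 15 1 2
        def 16 0 1
        def 17 0 2
        def 18 1 3
        ')
        test <- data.table(read.table(tc, header = TRUE))

        # Within each id, a new journey starts on the row after an end_yn == 1,
        # so the counter is 1 + the running count of *previous* end_yn == 1 rows.
        test[, journey_within := cumsum(shift(end_yn, fill = 0)) + 1, by = id]

        # Turn (id, journey_within) pairs into one global counter; because the data
        # is ordered by id and time, rleid() hands out consecutive numbers.
        test[, journey := rleid(id, journey_within)]
        test[, journey_within := NULL]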

    Read the article

  • jQuery hide all table rows which contain a hidden field matching a value

    - by Famous Nerd
    Though I don't doubt this has been answered, I cannot find a great match for my question. I have a table whose rows I'd like to filter based on whether or not they contain a hidden field matching a value. I understand that the technique tends to be "show all rows", "filter the set", "show/hide that filtered set". I have the following jQuery, but I'm awful with filter and my filtered set always seems to contain no elements. My table is the usual:

        <table>
          <tr><td>header</td><td>&nbsp;</td></tr>
          <tr>
            <td>a visible cell</td>
            <td><input type='hidden' id='big-asp.net-id' value='what-im-filtering-on' /></td>
          </tr>
        </table>

    My goal is to be able to match on tr elements whose descendants contain a hidden input containing either true or false. This is how I've tried the selector (variations of this), and I'm not even testing for the value yet:

        function OnFilterChanged(e){
            //debugger;
            var checkedVal = $("#filters input[type='radio']:checked").val();
            var allRows = $("#match-grid-container .tabular-data tr");
            if(checkedVal=="all"){
                allRows.show();
            }
            else if(checkedVal=="matched"){
                allRows.show();
                allRows.filter(function(){$(this).find("input[type='hidden'][id~='IsAutoMatchHiddenField']")}).hide();
            }
            else if(checkedVal=="unmatched"){
            }
        }

    Am I way off with the filter? Is the $(this) required in the filter so that I can do the descendant searching? Thanks kindly.
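    The filter callback above computes a jQuery object but never returns anything, so no row ever matches. A rough sketch of the two branches, assuming the hidden field's value holds 'true' or 'false'; id$= (ends-with) is used because ASP.NET prefixes the rendered id, whereas ~= only matches space-separated words:

        function OnFilterChanged(e) {
            var checkedVal = $("#filters input[type='radio']:checked").val();
            var allRows = $("#match-grid-container .tabular-data tr");

            allRows.show();
            if (checkedVal == "matched") {
                // hide rows whose hidden field says they are NOT auto-matched
                allRows.filter(function () {
                    return $(this).find("input[type='hidden'][id$='IsAutoMatchHiddenField'][value='false']").length > 0;
                }).hide();
            } else if (checkedVal == "unmatched") {
                // hide rows whose hidden field says they ARE auto-matched
                allRows.filter(function () {
                    return $(this).find("input[type='hidden'][id$='IsAutoMatchHiddenField'][value='true']").length > 0;
                }).hide();
            }
        }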

    Read the article

  • Controlling read and write access width to memory mapped registers in C

    - by srking
    I'm using an x86-based core to manipulate a 32-bit memory-mapped register. My hardware behaves correctly only if the CPU generates 32-bit wide reads and writes to this register. The register is aligned on a 32-bit address and is not addressable at byte granularity. What can I do to guarantee that my C (or C99) compiler will only generate full 32-bit wide reads and writes in all cases? For example, if I do a read-modify-write operation like this:

        volatile uint32_t* p_reg = 0xCAFE0000;
        *p_reg |= 0x01;

    I don't want the compiler to get smart about the fact that only the bottom byte changes and generate 8-bit wide reads/writes. Since the machine code is often denser for 8-bit operations on x86, I'm afraid of unwanted optimizations. Disabling optimizations in general is not an option.
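    In practice a volatile uint32_t access is emitted at full width by mainstream compilers, but the standard does not pin the access width down, so the usual defensive pattern is to route every register access through 32-bit read/write helpers, do the modify step on a local variable, and check the generated assembly. A minimal sketch using the question's register address:

        #include <stdint.h>

        #define REG_ADDR ((volatile uint32_t *)0xCAFE0000u)

        static inline uint32_t reg_read32(volatile uint32_t *reg)
        {
            return *reg;                        /* one 32-bit load */
        }

        static inline void reg_write32(volatile uint32_t *reg, uint32_t val)
        {
            *reg = val;                         /* one 32-bit store */
        }

        void set_enable_bit(void)
        {
            uint32_t v = reg_read32(REG_ADDR);  /* read (32-bit) */
            v |= 0x01u;                         /* modify in a CPU register, not in MMIO */
            reg_write32(REG_ADDR, v);           /* write (32-bit) */
        }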

    Read the article

  • CSV file read fail (PHP)

    - by user1020069
    I am trying to read a CSV file (delimited by commas), but unfortunately it isn't responding as it ought to. I am not sure what I am doing wrong here, but I'll paste the contents of both the code and the CSV file:

        $row = 0;
        if($handle = fopen("SampleQuizData.csv","r") !== FALSE)
        {
            // WORKS UNTIL HERE, SO FILE IS BEING READ
            while(!feof(handle)){
                $line = fgetcsv($handle, 1024, ",") ;
                echo $line[2]; // DOES NOT WORK
            }
        }

    And the CSV file is (the emails and names have been changed here to protect the identities of the users):

        parijat,something,[email protected]
        matthew,durp, [email protected]
        steve,vai,[email protected]
        rajni,kanth,[email protected]
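    Two things in the snippet keep it from working: without extra parentheses, $handle = fopen(...) !== FALSE stores the boolean result of the comparison instead of the file handle, and feof(handle) is missing the $ sigil. A minimal corrected sketch:

        <?php
        $row = 0;
        // Parentheses around the assignment, so $handle gets the resource.
        if (($handle = fopen("SampleQuizData.csv", "r")) !== FALSE) {
            // fgetcsv() returns FALSE at end of file, so it can drive the loop.
            while (($line = fgetcsv($handle, 1024, ",")) !== FALSE) {
                // $line[2] is the third comma-separated field (the e-mail address).
                echo trim($line[2]), "\n";
                $row++;
            }
            fclose($handle);
        }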

    Read the article

  • Update table using SSIS

    - by thursdaysgeek
    I am trying to update a field in a table with data from another table, based on a common key. If it were straight SQL, it would be something like:

        UPDATE EHSIT
        SET e.IDMSObjID = s.IDMSObjID
        FROM EHSIT e, EHSIDMS s
        WHERE e.SITENUM = s.SITE_CODE

    However, the two tables are not in the same database, so I'm trying to use SSIS to do the update. Oh, and the sitenum/site_code columns are varchar in one and nvarchar in the other, so I'll have to do a data conversion for them to match. How do I do it? I have a data flow object with EHSIDMS as the source and EHSIT as the destination, and a data conversion to convert the Unicode to non-Unicode. But how do I update based on the match? I've tried the destination with a SQL command as the data access mode, but it doesn't appear to have the source table. If I just map the field to be updated, how does it limit it based on the fields matching? I'm about to export my source table to Excel or something and try inputting from there, although it seems all that would buy me is removing the data conversion step. Shouldn't there be an update data task or something? Is it one of those Data Flow transformation tasks, and I'm just not figuring out which one it is?
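    SSIS has no dedicated "update" destination: the two usual options are an OLE DB Command transformation running a parameterised UPDATE per row, or (usually faster for larger sets) a data flow that lands the source rows in a staging table on the destination server followed by an Execute SQL task running one set-based UPDATE. A rough sketch of that statement, assuming a staging table named EHSIDMS_Staging; the varchar length in the CAST is an assumption:

        -- Runs on the destination server after the data flow has loaded the
        -- staging table (with the Unicode-to-non-Unicode conversion done in
        -- the data flow or via the CAST below).
        UPDATE e
        SET    e.IDMSObjID = s.IDMSObjID
        FROM   dbo.EHSIT AS e
        INNER JOIN dbo.EHSIDMS_Staging AS s
                ON e.SITENUM = CAST(s.SITE_CODE AS varchar(50));  -- length 50 is an assumption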

    Read the article

  • Positioning Photos in a Grid (HTML)

    - by Daniel O'Connor
    Hey everyone, I've been trying to code this page for a while, but my biggest problem is that I can't seem to get the photos perfectly positioned. For some reason, there is a small bottom padding in each <td> which is messing things up. Here is the table code:

        <table>
          <tr>
            <td rowspan="2" style="height:353px;"><img src="danoconnor/img/photography/farm.jpg" height="353" width="470" alt="Farm" /></td>
            <td><img src="danoconnor/img/photography/paragliding.jpg" height="190" width="254" alt="Paraglider" /></td>
            <td rowspan="2"><img src="danoconnor/img/photography/cristo.jpg" height="353" width="230" alt="Cristo Redentor" /></td>
          </tr>
          <tr>
            <td><img src="danoconnor/img/photography/u2.jpg" height="154" width="254" alt="U2 at Fordham University" /></td>
          </tr>
        </table>

    My question is: how can I make the photo grid look like this? Thanks!
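    The small gap under each image is almost always the inline-image baseline (descender) space rather than real padding; making the images display: block and zeroing the table's own spacing removes it. A minimal sketch, where the photo-grid class is a placeholder to hang the rules on:

        <style>
          /* remove the table's own gaps */
          table.photo-grid { border-collapse: collapse; border-spacing: 0; }
          table.photo-grid td { padding: 0; }
          /* kill the inline-image descender gap at the bottom of each cell */
          table.photo-grid img { display: block; }
        </style>
        <!-- then mark up the table as <table class="photo-grid"> ... </table> -->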

    Read the article

  • Create new table with Wordpress API

    - by Fire G
    I'm trying to create a new plugin to track popular posts based on views, and I have everything done and ready to go, but I can't seem to create a new table using the WordPress API (I can do it with standard PHP or with phpMyAdmin, but I want this plugin to be self-sufficient). I've tried several ways ($wpdb->query, $wpdb->get_results, dbDelta), but none of them will create the new table.

        function create_table(){
            global $wpdb;
            $tablename = $wpdb->prefix.'popular_by_views';
            $ppbv_table = $wpdb->get_results("SHOW TABLES LIKE '".$tablename."'" , ARRAY_N);
            if(is_null($ppbv_table)){
                $create_table_sql = "CREATE TABLE '".$tablename."' (
                    'id' BIGINT(50) NOT NULL AUTO_INCREMENT,
                    'url' VARCHAR(255) NOT NULL,
                    'views' BIGINT(50) NOT NULL,
                    PRIMARY KEY ('id'),
                    UNIQUE ('id')
                );";
                $wpdb->show_errors();
                $wpdb->flush();
                if(is_null($wpdb->get_results("SHOW TABLES LIKE '".$tablename."'" , ARRAY_N)))
                    echo 'crap, the SQL failed.';
            }
            else
                echo 'table already exists, nothing left to do.';
        }
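    Two things stand out in the snippet: $create_table_sql is built but never passed to $wpdb->query(), and MySQL identifiers cannot be wrapped in single quotes (use backticks or no quoting). A rough sketch of the usual WordPress pattern with dbDelta(), keeping the question's table and column names:

        <?php
        function ppbv_create_table() {
            global $wpdb;
            $table_name = $wpdb->prefix . 'popular_by_views';

            // get_var() returns the table name if it exists, NULL otherwise.
            if ( $wpdb->get_var( "SHOW TABLES LIKE '$table_name'" ) != $table_name ) {
                // dbDelta() is picky: one column per line, two spaces after PRIMARY KEY,
                // and no quotes around identifiers.
                $sql = "CREATE TABLE $table_name (
                    id BIGINT(20) NOT NULL AUTO_INCREMENT,
                    url VARCHAR(255) NOT NULL,
                    views BIGINT(20) NOT NULL,
                    PRIMARY KEY  (id)
                );";

                require_once ABSPATH . 'wp-admin/includes/upgrade.php';
                dbDelta( $sql );
            }
        }
        register_activation_hook( __FILE__, 'ppbv_create_table' );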

    Read the article

  • read in bash on tab-delimited file without empty fields collapsing

    - by Charles Duffy
    I'm trying to read a multi-line tab-separated file in bash. The format is such that empty fields are expected. Unfortunately, the shell is collapsing together field separators which are next to each other, as so:

        # IFS=$'\t'
        # read one two three <<<$'one\t\tthree'
        # printf '<%s> ' "$one" "$two" "$three"; printf '\n'
        <one> <three> <>

    ...as opposed to the desired output of <one> <> <three>. Can this be resolved without resorting to a separate language (such as awk)?
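    This is standard field splitting: tab counts as IFS whitespace, and any run of IFS whitespace delimits a single field, so the empty middle field disappears. A rough workaround sketch (not the answer from the linked thread) that splits each line manually so adjacent tabs yield empty fields:

        #!/usr/bin/env bash
        line=$'one\t\tthree'

        fields=()
        # Read up to each tab individually; IFS= keeps read from trimming anything.
        while IFS= read -r -d $'\t' field; do
            fields+=( "$field" )
        done <<< "${line}"$'\t'     # trailing tab so the last field is emitted too

        printf '<%s> ' "${fields[@]}"; printf '\n'
        # prints: <one> <> <three>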

    Read the article

  • How to link to an Excel pivot table that will expand over time in Word 2007?

    - by Daljit Dhadwal
    I have a pivot table in Excel 2007 which I've pasted into Word 2007 using Paste Special (Paste link) - Microsoft Office Excel Worksheet Object. The pivot table appears in Word and the link to Excel is working. The problem is that if the pivot table expands (for example, due to showing 12 months of data rather than six), the link in Word will only show the range of cells that were originally copied over with the pivot table. I understand why this happens. When I paste as a link into Word, the underlying field codes look like this:

        {LINK Excel.Sheet.8 "C:\Users\myAccount\Documents\testexcel.xlsx" "Sheet2!R1C1:R8C2" \a \p}

    The codes refer to a fixed area (e.g., Sheet2!R1C1:R8C2) of the Excel spreadsheet, so when the pivot table expands, the expanded cells fall outside the area defined in the field codes. Is there some way to have the link refer to the pivot table itself rather than the cell range that happened to be copied over from Excel?

    Read the article

  • Read from file into pointer to struct

    - by cla barzu
    I need help with pointers in C. I have to read from a file and fill an array of pointers to struct rcftp_msg. So far I have done the following:

        struct rcftp_msg {
            uint8_t version;
            uint8_t flags;
            uint16_t len;
            uint8_t buffer[512];
        };

        struct rcftp_msg *windows[10];

        pfile = fopen(file,"r"); // Open the file

    I have to read from the file into the buffer, but I don't know how to do it. I tried the following:

        for (i = 0; i < 10; i++){
            leng = fread(windows[i]->buffer, sizeof(uint8_t), 512, pfile);
        }

    I think windows[i]->buffer is wrong, because that doesn't work. Sorry for my bad English :(
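    A minimal sketch of the missing pieces: each windows[i] has to point at allocated storage before anything can be read into it, and fread() takes the destination buffer directly (windows[i]->buffer decays to its address). The file name is a placeholder:

        #include <stdint.h>
        #include <stdio.h>
        #include <stdlib.h>

        struct rcftp_msg {
            uint8_t  version;
            uint8_t  flags;
            uint16_t len;
            uint8_t  buffer[512];
        };

        int main(void)
        {
            struct rcftp_msg *windows[10];
            FILE *pfile = fopen("file.dat", "r");   /* placeholder file name */
            if (pfile == NULL) {
                perror("fopen");
                return 1;
            }

            for (int i = 0; i < 10; i++) {
                windows[i] = malloc(sizeof *windows[i]);   /* allocate the struct first */
                if (windows[i] == NULL) {
                    perror("malloc");
                    return 1;
                }
                size_t got = fread(windows[i]->buffer, sizeof(uint8_t), 512, pfile);
                printf("message %d: read %zu bytes\n", i, got);
            }

            fclose(pfile);
            return 0;
        }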

    Read the article

  • Updating table row by given id with jQuery

    - by fabrik
    Hello all! I need to update a specific table row (via tr id="unique_key") after a successful AJAX call. HTML fragment:

        <a name=\"p{$product_id}\" class=\"tr{$product_id}\"></a>
        <tr id="p{product_id}" class="item-row">
          <td><h3>{product_id}</h3><a rel="facebox" href="ajax_url">{product_name}</a></td>
          <td>{image_information}</td>
          <td>{image_sortiment}</td>
          <td>{product_status}</td>
        </tr>

    Javascript:

        // AJAX Call
        success: function(msg){
            $('#p' + prod_id).remove();
            $('.tr' + prod_id).after(msg);
            $('#p' + prod_id + ' a[rel*=facebox]').facebox();
        }

    What happens:

        - the table row is removed
        - the anchors get grouped into one single row (not before their <tr>s), so my 'hook' disappears
        - the AJAX result is inserted over the whole table (after my 'hook', but still in the wrong place)

    What's wrong with my idea? How can I force jQuery to 'overwrite' the required table row?
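    The anchors are the real problem: an a element is not valid between table rows, so browsers hoist it out of the table and the class-based 'hook' lands in the wrong place. Since each row already carries a unique id, it can be swapped in place with replaceWith(); a minimal sketch, assuming msg contains the complete replacement <tr> (prod_id as in the question):

        // AJAX call (same structure as above)
        success: function (msg) {
            // Replace the row in place by its id instead of remove() + after().
            $('#p' + prod_id).replaceWith(msg);
            // Re-attach facebox to the freshly inserted row's links.
            $('#p' + prod_id + ' a[rel*=facebox]').facebox();
        }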

    Read the article

  • Read Velocity Tokens/Tag from .vm file

    - by user1801660
    I have an application in which I am trying to create a Velocity template repository that will centralise all my email templates and let me build a communication hub. All templates are loaded at runtime and populated with data via services. My problem is that I need to give users a list of optional and compulsory params when they define the inputs for a Velocity template. Is there a way to read the tokens/tags from the Velocity template file and extract them? I want a list of tokens like $name.address.streetName to be available to me from the .vm file. I do not want to use a regex. I do not have to cache or reuse them; it is just going to be a one-time read, storing the default, compulsory and optional params in the database. I am following these patterns:

        http://kickjava.com/src/org/apache/velocity/test/view/TemplateNodeView.java.htm
        How to use String as Velocity Template?

    Please advise.
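    One non-regex route, assuming the Velocity 1.x parser API (the same family of classes the TemplateNodeView example linked above uses; method names differ in Velocity 2.x): parse the .vm file into its AST and collect every ASTReference node. A rough sketch with a placeholder template path:

        import java.io.FileReader;
        import org.apache.velocity.runtime.RuntimeServices;
        import org.apache.velocity.runtime.RuntimeSingleton;
        import org.apache.velocity.runtime.parser.node.ASTReference;
        import org.apache.velocity.runtime.parser.node.Node;
        import org.apache.velocity.runtime.parser.node.SimpleNode;

        public class TemplateTokenLister {

            public static void main(String[] args) throws Exception {
                RuntimeServices rs = RuntimeSingleton.getRuntimeServices();
                SimpleNode root = rs.parse(new FileReader("email_template.vm"), "email_template.vm");
                listReferences(root);
            }

            // Walk the AST and print every $reference as it appears in the template.
            private static void listReferences(Node node) {
                if (node instanceof ASTReference) {
                    // literal() returns the reference as written, e.g. $name.address.streetName
                    System.out.println(((ASTReference) node).literal());
                }
                for (int i = 0; i < node.jjtGetNumChildren(); i++) {
                    listReferences(node.jjtGetChild(i));
                }
            }
        }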

    Read the article

  • Hook to make Subversion Read Only for specific users

    - by Shane
    We have an existing Subversion repository that uses LDAP to manage users/passwords. There are some new users to whom we would like to give read-only access to SVN. I did some Google searches and found a way to open up read-only access to anonymous users, but this is not what we want; we do not want to open up SVN to everyone. We still want to control login through LDAP, but we would like to prevent certain named users from being able to add/edit/delete. I am assuming this can be done with a hook (pre-commit?), but I have no experience writing hooks. Can someone show me or point me to an example of how to do this?
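    A pre-commit hook can do this, although the more common route is Subversion's path-based authorization (an authz file that gives those accounts r instead of rw). A rough hook sketch that rejects commits from named read-only users; the user names are placeholders:

        #!/bin/sh
        # pre-commit hook: $1 is the repository path, $2 the transaction id.
        REPOS="$1"
        TXN="$2"

        AUTHOR=$(svnlook author -t "$TXN" "$REPOS")

        for user in alice bob; do                # placeholder read-only user names
            if [ "$AUTHOR" = "$user" ]; then
                echo "User '$AUTHOR' has read-only access to this repository." >&2
                exit 1                           # non-zero exit rejects the commit
            fi
        done
        exit 0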

    Read the article

  • Pivot table from multiple spreadsheets

    - by vrao
    I am using Excel 2010 and am trying to create a pivot table from two worksheets, 'Summary' and 'Summary2'. I have an identical row of data ranging from cells B5 to F5 in row 5 of both worksheets. The data in the two worksheets looks like this:

        Summary worksheet:  Issues, 20, 3, 4, 5
        Summary2 worksheet: Issues, 10, 0, 3, 9

    The Summary worksheet refers to issues from location 1 and Summary2 refers to issues from location 2. Column B has the title 'Issues', column C refers to issues of customer 1, column D to customer 2, column E to customer 3 and column F to customer 4. I go to a third worksheet, start a pivot table, and in the table range I enter: 'Summary:Summary2'!$B$5:$F$5. Then I click OK and get the error "data reference source is not valid". Can someone tell me how to select the row from two different worksheets in a pivot table? I also want to be able to add the issues of customers between the two locations and get % completion for each location. Can someone please help?

    Read the article

  • Using before_create in Rails to normalize a many to many table

    - by weotch
    I am working on a pretty standard tagging implementation for a table of recipes. There is a many-to-many relationship between recipes and tags, so the tags table will be normalized. Here are my models:

        class Recipe < ActiveRecord::Base
          has_many :tag_joins, :as => :parent
          has_many :tags, :through => :tag_joins
        end

        class TagJoin < ActiveRecord::Base
          belongs_to :parent, :polymorphic => true
          belongs_to :tag, :counter_cache => :usage_count
        end

        class Tag < ActiveRecord::Base
          has_many :tag_joins, :as => :parent
          has_many :recipes, :through => :tag_joins, :source => :parent, :source_type => 'Recipe'

          before_create :normalizeTable

          def normalizeTable
            t = Tag.find_by_name(self.name)
            if (t)
              j = TagJoin.new
              j.parent_type = self.tag_joins.parent_type
              j.parent_id = self.tag_joins.parent_id
              j.tag_id = t.id
              return false
            end
          end
        end

    The last bit, the before_create callback, is what I'm trying to get working. My goal is that if there is an attempt to create a new tag with the same name as one already in the table, only a single row is produced in the join table, using the existing row in tags. Currently the code dies with:

        undefined method `parent_type' for #<Class:0x102f5ce38>

    Any suggestions?
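    The immediate error comes from calling parent_type on the tag_joins collection rather than on a single join row, and returning false from before_create aborts the save without creating any join at all. A rough sketch of the more common pattern, reusing an existing Tag when the association is built; the add_tag helper is hypothetical, and find_or_create_by_name is the Rails 2.x/3.x dynamic finder (Rails 4+ spells it find_or_create_by(name: ...)):

        class Recipe < ActiveRecord::Base
          has_many :tag_joins, :as => :parent
          has_many :tags, :through => :tag_joins

          # Attach a tag by name, reusing an existing Tag row when there is one.
          def add_tag(name)
            tag = Tag.find_or_create_by_name(name.strip.downcase)
            # Creating through the association fills in parent_id/parent_type.
            tag_joins.create(:tag => tag) unless tags.include?(tag)
          end
        end

        # usage: recipe.add_tag("soup")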

    Read the article

  • Is it possible to have a mysql table accept a null value for a primary_key column referencing a diff

    - by Dr.Dredel
    I have a table that has a column which holds the id of a row in another table. However, when table A is being populated, table B may or may not have a row ready for table A. My question is: is it possible to have MySQL prevent an invalid value from being entered but be OK with a NULL? Or does a foreign key necessitate a valid related value? So, what I'm looking for (in pseudo code) is this:

        Table "person"
            id | name

        Table "people"
            id | group_name | person_id (foreign key id from table person)

        insert into person (1, 'joe');
        insert into people (1, 'foo', 1)    //kosher
        insert into people (1, 'foo', NULL) //also kosher
        insert into people (1, 'foo', 7)    // should fail since there is no id 7 in the person table

    The reason I need this is that I'm having a chicken-and-egg issue where it makes perfect sense for the rows in the people table to be created beforehand (in this example, I'm creating the groups and would like them to pre-exist the people who join them). And I realize that THIS example is silly and I would just put the group id in the person table rather than vice versa, but in my real-world problem that is not workable. Just curious if I need to allow any and all values in order to make this work, or if there's some way to allow for NULL.
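    Yes: a nullable foreign-key column accepts NULL but rejects non-NULL values that have no match in the parent table (InnoDB is required for the constraint to be enforced). A minimal sketch following the question's pseudo code:

        CREATE TABLE person (
            id   INT PRIMARY KEY,
            name VARCHAR(100)
        ) ENGINE=InnoDB;

        CREATE TABLE people (
            id         INT,
            group_name VARCHAR(100),
            person_id  INT NULL,
            FOREIGN KEY (person_id) REFERENCES person (id)
        ) ENGINE=InnoDB;

        INSERT INTO person VALUES (1, 'joe');
        INSERT INTO people VALUES (1, 'foo', 1);     -- ok: person 1 exists
        INSERT INTO people VALUES (2, 'foo', NULL);  -- ok: NULL bypasses the FK check
        INSERT INTO people VALUES (3, 'foo', 7);     -- fails: no person with id 7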

    Read the article

  • Inserting Row in Table inside Form tag autosubmitting in firefox/chrome

    - by user1861489
    I have a form that will have dynamic elements inserted with JavaScript, and I am experiencing some strange behavior. When I click the button to add another element to the table in the form, it adds the element but then seems to do a form post immediately (without my intending to submit the form yet). I have created a simplified example of the page that shows the same behavior: the first table element is created on page load and subsequent elements are added when clicking the button. This form works successfully in IE. Does anyone have an idea of how to prevent this behavior? Here is the code sample:

        <!DOCTYPE html>
        <html>
        <head>
          <title>Test Creating Form</title>
          <meta http-equiv="Content-type" content="text/html;charset=UTF-8">
          <style type="text/css">
            td{font-family:verdana;}
          </style>
          <script type="text/javascript">
            var counter = 0;
            function makeTitle(title){
              if(counter){
                title += " " + counter;
              }
              counter++;
              var tbl = document.getElementById('tbl');
              var tr = tbl.insertRow(-1);
              var td1 = tr.insertCell(-1);
              td1.innerHTML = title;
            }
            function load1(){
              makeTitle('Primary Specimen');
            }
          </script>
        </head>
        <body onload="load1();">
          <form action="formtest.htm" method="post" name="testForm" id="testForm">
            <table id="tbl" border="1"></table>
            <button onclick="makeTitle('Alternate Specimen')" id="clone">Add Another Specimen</button>
          </form>
        </body>
        </html>
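    Inside a form, a button element with no type attribute defaults to type="submit" in Firefox and Chrome (older IE defaulted to type="button", which is why the page behaves differently there), so every click posts the form. A minimal fix sketch:

        <!-- Declare the type so the click only runs the handler and never submits. -->
        <button type="button" id="clone" onclick="makeTitle('Alternate Specimen')">
            Add Another Specimen
        </button>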

    Read the article

  • Concurrent usage of table causing issues

    - by Sven
    Hello. In our current project we are interfacing with a third-party data provider. They need to insert data into a table of ours. The inserting can be frequent: every minute, every 5 minutes, every 30, depending on the amount of new data they need to provide. They use the isolation level read committed. On our end we have an application, a Windows service, that calls a web service every 2 minutes to see if there is new data in this table. Our isolation level is repeatable read. We retrieve the records and update a column on these rows.

    Now the problem is that sometimes this third-party provider needs to insert a lot of data, let's say 5000 records. They do this in transactions of 5 rows each, but they don't close the connection: they run one transaction and then the next until all records are inserted. This causes issues for our process; we receive a timeout. If this goes on for a long time the database gets completely unstable. For instance, they may have stopped, but the table somehow still stays unavailable. When I try to do a select on the table, I get several records, but at a certain moment I don't get any response anymore. It just says retrieving data, and nothing more comes until I get a timeout exception. The only solution is to restart the database, and then I see the other records.

    How can we solve this? What is the ideal isolation level setting in this scenario?
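    If the database is SQL Server (the question does not say, but the terminology fits), repeatable read keeps shared locks on every row the poller reads until its transaction ends, so the reader and the stream of small insert transactions end up blocking each other. A common mitigation is row-versioning, so readers see the last committed row version instead of waiting on writers; a rough sketch with placeholder database and table names:

        -- Switching READ_COMMITTED_SNAPSHOT needs exclusive access to the database.
        ALTER DATABASE MyDatabase SET ALLOW_SNAPSHOT_ISOLATION ON;
        ALTER DATABASE MyDatabase SET READ_COMMITTED_SNAPSHOT ON;

        -- The polling service can then read under plain READ COMMITTED
        -- (now versioned) instead of REPEATABLE READ:
        SET TRANSACTION ISOLATION LEVEL READ COMMITTED;
        BEGIN TRANSACTION;
        SELECT Id, Payload FROM dbo.IncomingData WHERE Processed = 0;  -- placeholder table/columns
        COMMIT TRANSACTION;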

    Read the article

  • read text file line by line and insert/update values in table

    - by I__
    I am exploring whether DoCmd.TransferText will do what I need, and it seems like it won't: I need to insert data if it does not exist and update it if it does exist. I am planning to read the text file line by line like this:

        Dim intFile As Integer
        Dim strLine As String
        intFile = FreeFile()
        Open myFile For Input As #intFile
        Line Input #intFile, strLine
        Close #intFile

    I guess each individual line will be a record. It will probably be comma separated, and some fields will have a " text qualifier because the field itself can contain commas. My question is: how would I read a comma-delimited text file that sometimes uses double quotes as text qualifiers into a table in Access?
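    A rough Access VBA sketch of one way to do it without TransferText: read line by line, split each line on commas while honouring the double-quote text qualifier, then insert or update. Table, field and file names are placeholders, and the values are not escaped before being concatenated into SQL:

        Sub ImportCsv(ByVal myFile As String)
            Dim intFile As Integer
            Dim strLine As String
            Dim f As Variant
            Dim db As DAO.Database

            Set db = CurrentDb
            intFile = FreeFile()
            Open myFile For Input As #intFile
            Do While Not EOF(intFile)
                Line Input #intFile, strLine
                If Len(Trim$(strLine)) > 0 Then
                    f = SplitCsvLine(strLine)
                    ' Placeholder table/fields: insert if the key is new, otherwise update.
                    If IsNull(DLookup("KeyField", "MyTable", "KeyField='" & f(0) & "'")) Then
                        db.Execute "INSERT INTO MyTable (KeyField, ValueField) " & _
                                   "VALUES ('" & f(0) & "','" & f(1) & "')", dbFailOnError
                    Else
                        db.Execute "UPDATE MyTable SET ValueField='" & f(1) & "' " & _
                                   "WHERE KeyField='" & f(0) & "'", dbFailOnError
                    End If
                End If
            Loop
            Close #intFile
        End Sub

        ' Splits one CSV line on commas, treating "..." as a text qualifier.
        Function SplitCsvLine(ByVal s As String) As Variant
            Dim parts() As String
            Dim i As Long, n As Long
            Dim ch As String, cur As String
            Dim inQuotes As Boolean
            ReDim parts(0 To 0)
            For i = 1 To Len(s)
                ch = Mid$(s, i, 1)
                If ch = """" Then
                    inQuotes = Not inQuotes              ' toggle qualifier state
                ElseIf ch = "," And Not inQuotes Then
                    parts(n) = cur: cur = "": n = n + 1  ' field boundary
                    ReDim Preserve parts(0 To n)
                Else
                    cur = cur & ch
                End If
            Next i
            parts(n) = cur
            SplitCsvLine = parts
        End Function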

    Read the article
