Search Results

Search found 7311 results on 293 pages for 'rows'.


  • Animating Reloading of UITableView

    - by ustun
    I am trying to animate the rows in a UITableView in an iPhone project as I swipe across the screen to reload the data. When I disable animations and only call reloadData, the table continues responding to swipe gestures. When I add animations with the reloadSections:withRowAnimation: method, the table stops responding to swipes, and only the navigation bar at the top responds to them. Another change is that the table starts responding to selection, and I have to manually disable that again. I suspect these two issues might be related. By the way, I am using the swipe detection code from over here.

    Read the article

  • Best approach to show a big amount of "grid" data

    - by Jorge Ramírez
    Hello all. I am building an application for Android (1.5) that, after querying a web service, shows the user a large amount of data that should be displayed in a "grid" or "table" style. I must show a result of about 7 columns and 50 rows (for example, a customer list with names, addresses, telephone numbers, sales amount last year and so on). Obviously, the 7 columns will not fit on the screen, and I would like the user to be able to scroll up/down and LEFT/RIGHT (important because of the number of columns) to explore the grid of results. Cell-level selection is NOT necessary; at most I would need row-level selection. What is the best approach to get this interface element? ListView / GridView / TableLayout? Thanks

    Read the article

  • Speeding up jQuery empty() or replaceWith() Functions When Dealing with Large DOM Elements

    - by Levi Hackwith
    Let me start off by apologizing for not giving a code snippet. The project I'm working on is proprietary and I'm afraid I can't show exactly what I'm working on. However, I'll do my best to be descriptive. Here's a breakdown of what goes on in my application:

    - The user clicks a button
    - The server retrieves a list of images in the form of a data table
    - Each row in the table contains 8 data cells that in turn each contain one hyperlink
    - Each request by the user can contain up to 50 rows (I can change this number if need be)
    - That means the table contains upwards of 800 individual DOM elements

    My analysis shows that jQuery("#dataTable").empty() and jQuery("#dataTable").replaceWith(tableCloneObject) take up 97% of my overall processing time and take on average 4 - 6 seconds to complete. I'm looking for a way to speed up either of the above mentioned jQuery functions when dealing with large numbers of DOM elements that need to be removed / replaced. I hope my explanation helps.

    Read the article

  • Analysis Services with Excel as front end - is it possible to get the nicer UI that PowerPivot provides?

    - by AJM
    I have been looking into PowerPivot and concluded that for "self-service BI" and ad hoc building of cubes it has its uses. In particular, I like the enhanced UI that you get from using PowerPivot rather than just using a PivotTable hooked up to an Analysis Services data source. However, it seems that hooking up PowerPivot to an existing Analysis Services cube is not a solution for "organisational BI". It is not always desirable to suck millions of rows into Excel at once, and the interface between PowerPivot and Analysis Services is very poor in my book. Hence the question: can an existing Analysis Services solution get the enhanced UI features that PowerPivot brings, without using PowerPivot as the design tool? If PowerPivot is aimed at self-service/personal BI, then it seems bizarre that its UI is better than that of bigger/more costly Analysis Services solutions.

    Read the article

  • Duplicate partitioning key performance impact

    - by Anshul
    I've read in some posts that having a duplicate partitioning key can have a performance impact. I have two tables like:

        CREATE TABLE "Test1" (
            key text,
            column1 text,
            value text,
            PRIMARY KEY (key, column1)
        )

        CREATE TABLE "Test2" (
            key text,
            name text,
            age text,
            ...
            PRIMARY KEY (key, name, age)
        )

    In Test1, column1 will contain a column name and value will contain its corresponding value. The main advantage of Test1 is that I can add any number of column/value pairs to it without altering the table, by just providing the same partitioning key each time. Now my question is: how will each of these table schemas impact the read/write performance if I have millions of rows and the number of columns can be up to 50 in each row? How will it impact the compaction/repair time if I'm writing duplicate entries frequently?

    Read the article

  • C# ListBox hide vertical scrollbar

    - by Codeffect
    How to hide the vertical scroll bar of a ListBox that is present inside a div?

        <td class="ctrlForm">
          <div id="lstQueriesDiv" style="OVERFLOW:auto; Width: 650px; height:167px;">
            <asp:ListBox ID="lstQueries" runat="server" CssClass="cssLstQueries" Rows="9"></asp:ListBox>
          </div>
        </td>

        .cssLstQueries {
          Width: auto;
          overflow: hidden;
          -ms-overflow-y: hidden;
          -ms-overflow-x: hidden;
        }

    Read the article

  • Subquery works in 9i but not in 11g

    - by Zsuetam
    The statement below works on Oracle 9i but not on Oracle 11g:

        SELECT *
        FROM (
            SELECT 0 scrnfail_rate, '9' zz, 7 hh FROM DUAL
            UNION ALL
            SELECT 0 scrnfail_rate, '9' zz, 7 hh FROM DUAL
        )
        WHERE zz IS NOT NULL
          AND TO_CHAR (hh) NOT IN (
            SELECT DECODE (
                     scrnfail_rate, 0, -1,
                     ROUND (LEVEL * 1 / (scrnfail_rate / 100)) - ROUND (1 / (2 * (scrnfail_rate / 100)))
                   ) AS nno
            FROM DUAL
            WHERE NVL (scrnfail_rate, 0) > 0
            CONNECT BY LEVEL <= ROUND (9 * scrnfail_rate / 100)
          )

    It looks like Oracle 11g is ignoring the DECODE, or even the whole WHERE clause, in the subquery. This query should return two rows, as it does on Oracle 9i, but instead it raises ORA-01476: divisor is equal to zero on Oracle 11g EE 11.2.0.1.0 - 64bit. Can anyone help? Thanks!

    Read the article

  • Automatically create table on MySQL server based on date?

    - by Anthony
    Is there an equivalent to cron for MySQL? I have a PHP script that queries a table based on the month and year, like:

        SELECT * FROM data_2010_1

    What I have been doing until now is: every time the script executes, it queries for the table; if the table exists it does the work, and if it doesn't it creates the table. I was wondering if I can just set something up on the MySQL server itself that will create the table (based on a default table) at the stroke of midnight on the first of the month. Update: based on the comments I've gotten, I'm thinking this isn't the best way to achieve my goal. So here are two more questions: If I have a table with thousands of rows added monthly, is this potentially a drag on resources? If so, what is the best way to partition this table, since the above is verboten? And what are the potential problems with the home-grown method I originally thought up? (A sketch of that home-grown method follows below.)

    Read the article
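
    A minimal sketch of the create-if-missing approach described above, shown in Python for brevity (the same CREATE TABLE IF NOT EXISTS ... LIKE statement can be issued from the PHP script just as well). The template table data_template, the connection details and the table-name pattern are assumptions for illustration, not anything from the question:

        import datetime
        import mysql.connector  # assumes MySQL Connector/Python is installed

        def ensure_month_table(conn):
            """Create this month's table from a template table if it does not exist yet."""
            today = datetime.date.today()
            table = "data_%d_%d" % (today.year, today.month)   # e.g. data_2010_1
            cur = conn.cursor()
            # LIKE copies the column definitions and indexes of the template table
            cur.execute("CREATE TABLE IF NOT EXISTS %s LIKE data_template" % table)
            conn.commit()
            cur.close()
            return table

        conn = mysql.connector.connect(host="localhost", user="app",
                                       password="secret", database="app")
        month_table = ensure_month_table(conn)   # query month_table as usual afterwards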

  • How to use a custom color for each TextView in a ListView that extends SimpleAdapter in Android?

    - by mob-king
    I have a ListView with custom rows whose adapter extends SimpleAdapter. Each row consists of two LinearLayouts: the first has two TextViews (one of which is hidden) in horizontal orientation; the second has two TextViews in horizontal orientation. Now, depending on the value in the hidden TextView, I want to set the color of the remaining items in the row. To put it simply: each ListView item has some custom colors whose value comes from the hidden field. I have done this by overriding getView() in the SimpleAdapter and returning a view for each item, but this makes the list very slow to render (and that, I think, is obvious, as there is so much work for each view before showing it). Can I do this in some more efficient way? Building the views in code and adding them to the list, instead of using an XML layout, may be one solution, or is there another? Any help? Thanks.

    Read the article

  • WPF style converter: "Convert" called by every DataGrid column using it

    - by Sonic Soul
    I created a converter and assigned it to a style. Then I assigned that style to the columns I want affected. As rows are added, and while stepping through the debugger, I noticed that the converter's Convert method gets called one time per column (each time it is used). Is there a way to optimize this better, so that it gets called only once and all columns using it get the same value?

        <Style x:Key="ConditionalColorStyle" TargetType="{x:Type DataGridCell}" BasedOn="{StaticResource CellStyle}">
            <Setter Property="Foreground">
                <Setter.Value>
                    <Binding>
                        <Binding.Converter>
                            <local:ConditionalColorConverter />
                        </Binding.Converter>
                    </Binding>
                </Setter.Value>
            </Setter>
        </Style>

    Read the article

  • Parameter passing vs. Table-Valued Parameters vs. XML to SQL 2008 from a .NET Application

    - by Harryboy
    We are working on an ASP.NET project, and there are three ways one can update data in the database when multiple rows need to be updated or inserted. Let's assume we need to update an employee's education details (which could be 1, 3, 5 or 10 records). Methods to update the data:

    1. Pass values as parameters (the traditional approach); if there are 10 records, 10 round trips are required.
    2. Pass the data as XML and write logic inside your stored procedure to get that data from the XML and update the table (only a single round trip required).
    3. Use table-valued parameters (only a single round trip required).

    Note: the data is available as a List, so I need to convert it to XML or some other format if I need to pass it. There are a number of places in the entire application where we need to update data in bulk (or multiple records). I just need your suggestions on:

    - Which method will be faster (please mention if there are some other overheads)
    - Manageability or testability concerns with any approach
    - Any other bottleneck or issue with any of the approaches (serialization/deserialization concerns, or a limit on the size of the data passed)
    - Any other method you would suggest for the same operations

    Thanks

    Read the article

  • Are there ways to improve NHibernate's performance regarding entity instantiation?

    - by denny_ch
    Hi folks, while profiling NHibernate with NHProf I noticed that a lot of time is spent on entity building, or at least spent outside the query duration (the database round trip). The project I'm currently working on prefetches some static data (which goes into the 2nd level cache) at application start. There are about 3000 rows in the result set (and maybe 30 columns), which is queried in 75 ms. The overall duration observed by NHProf is about 13 SECONDS! Is this typical behaviour? I know that NHibernate shouldn't be used for bulk operations, but I didn't think that entity instantiation would be so expensive. Are there ways to improve performance in such situations, or do I have to live with it? Thx, denny_ch

    Read the article

  • Single Large vs. Multiple Small MySQL tables for storing Options

    - by Prasad
    Hi there, I'm aware of several questions on this forum relating to this, but I'm not talking about splitting tables for the same entity (like user, for example). Suppose I have a huge options table that stores list options like Gender, Marital Status, and many more domain-specific groups with the same structure, which I plan to capture in an OPTIONS table. Another simple option is to have the field set as ENUM, but there are disadvantages to that as well: http://www.brandonsavage.net/why-you-should-replace-enum-with-something-else/

    OPTIONS table:

    - option_id <will be referred to instead of the name>
    - name
    - value
    - group

    Query:

        select .. from options where group = '15'

    Some considerations:

    - Since this table is expected to be multi-tenant, the number of rows could grow drastically.
    - I believe splitting the tables, instead of filtering by the group, would be easier to write and faster to execute.
    - Or perhaps partitioning by the group or tenant?

    Please suggest. Thanks

    Read the article

  • Dynamically generated file upload control (using JavaScript) doesn't post?

    - by udaya
    Hi, I have a form which contains a file-type input. On submit I am calling this script:

        function addRowToTable() {
            var tbl = document.getElementById('uploadTab');
            var lastrow = tbl.rows.length;
            var iteration = lastrow;
            var row = tbl.insertRow(lastrow);
            var cell2 = row.insertCell(0);
            var e2 = document.createElement('input');
            e2.type = 'file';
            e2.name = 'ufile[]';
            e2.id = 'ufile[]';
            e2.size = '50';
            cell2.appendChild(e2);
        }

    This script generates the <tr> on a button click. In my view-generated-source tool I see the generated row like this:

        <tr><td><input size="50" id="ufile[]" name="ufile[]" type="file"></td></tr>

    When I submit the form I don't get the file name for the generated file input in my view page, but I do get the file name for the default one. What may be the problem?

    Read the article

  • Copying Some Rows from a PostgreSQL Server to Another

    - by whollychao
    I am in need of an application that can periodically transmit selected rows from a PostgreSQL database across a network to a second PostgreSQL server. Typically this will be the most recent row added, pulled and transmitted every 10-30 seconds. The primary servers run in an MS Windows environment with a high-latency, and occasionally intermittent, network connection. Therefore, any application would have to be tolerant of this and ideally reconnect automatically and resend data that could not be transmitted. Due to the environment and the requirements, a full-blown replication package would be unnecessary. I appreciate any help anyone has with this problem. (A rough sketch of the polling approach follows below.)

    Read the article
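
    A rough sketch of one way to do the periodic copy described above, written in Python with psycopg2. The table readings, its columns, and the connection strings are assumptions for illustration only; when the link drops, last_id is left unchanged so unsent rows are picked up again on the next pass:

        import time
        import psycopg2

        SOURCE_DSN = "host=primary dbname=app user=copier password=secret"     # hypothetical
        TARGET_DSN = "host=secondary dbname=app user=copier password=secret"   # hypothetical
        POLL_SECONDS = 15

        def copy_new_rows(last_id):
            """Copy rows newer than last_id from the source server to the target server."""
            src = dst = None
            try:
                src = psycopg2.connect(SOURCE_DSN)
                dst = psycopg2.connect(TARGET_DSN)
                with src.cursor() as s_cur, dst.cursor() as d_cur:
                    s_cur.execute(
                        "SELECT id, payload, created_at FROM readings WHERE id > %s ORDER BY id",
                        (last_id,))
                    rows = s_cur.fetchall()
                    for row in rows:
                        d_cur.execute(
                            "INSERT INTO readings (id, payload, created_at) VALUES (%s, %s, %s)",
                            row)
                    dst.commit()
                    if rows:
                        last_id = rows[-1][0]
                return last_id
            finally:
                if src is not None:
                    src.close()
                if dst is not None:
                    dst.close()

        last_id = 0
        while True:
            try:
                last_id = copy_new_rows(last_id)
            except psycopg2.OperationalError:
                # connection dropped mid-transfer; last_id is unchanged, so the
                # same rows are transmitted again on the next pass
                pass
            time.sleep(POLL_SECONDS)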

  • How to impose maxlength on textarea in HTML, JavaScript

    - by Rakesh Juyal
    I would like to have some functionality by which, if I write

        <textarea maxlength="50"></textarea>
        <textarea maxlength="150"></textarea>
        <textarea maxlength="250"></textarea>

    it will automatically impose the maxlength on the textarea. If possible, please do not provide the solution in jQuery. Note: this can be done if I do something like the following (copied from another thread):

        <textarea onkeypress="return imposeMaxLength(event, this, 110);" rows="4" cols="50"></textarea>

        function imposeMaxLength(Event, Object, MaxLen) {
            return (Object.value.length <= MaxLen) || (Event.keyCode == 8 || Event.keyCode == 46 || (Event.keyCode >= 35 && Event.keyCode <= 40));
        }

    But the point is that I don't want to write onKeyPress and onKeyUp every time I declare a textarea.

    Read the article

  • EOF of Excel in VB6

    - by Mark
    How do I write code in VB6 to find the EOF of an Excel file? Can anyone help me? I tried this code and it works:

        Dim excelApp As Excel.Application
        Dim excelWB As Excel.Workbook
        Set excelApp = New Excel.Application
        Set excelWB = excelApp.Workbooks.Open("D:\Book1.xls")

        Dim xlsRow As Long
        Dim EOF As Boolean
        xlsRow = 1

        Do While (EOF = False)
            If (excelWB.Sheets("Sheet1").Cells(xlsRow, 1).Value = "") Then
                EOF = True
            Else
                xlsRow = xlsRow + 1
            End If
        Loop

    This code works, but the only problem is that only column 1 is checked and the others are not. Can anyone help me improve this code so that it checks all the rows and columns of the Excel cells?

    Read the article

  • Querying Two Tables At Once

    - by John
    Hello, I am trying to do what I believe is called a join query. First, in a MySQL table called "login," I want to look up what "loginid" is in the record where "username" equals $profile. (This will be just one record / row in the MySQL table). Then, I want to take that "loginid" and look up all rows / records in a different MySQL table called "submission," and pull data that have that "loginid." This could possibly be more than one record / row. How do I do this? The code below doesn't seem to work. Thanks in advance, John

        $profile = mysql_real_escape_string($_GET['profile']);
        $sqlStr = "SELECT l.username, l.loginid, s.loginid, s.submissionid, s.title, s.url,
                          s.datesubmitted, s.displayurl
                   FROM submission AS s, login AS l
                   WHERE l.username = '$profile', s.loginid = l.loginid
                   ORDER BY s.datesubmitted DESC";

    Read the article

  • Styling 15 minute slots in RadScheduler

    - by user296386
    Is there any way to change the colour of 15 minute slots? I found an article which shows how to change the colour for an hour slot. Below is my code; please can you help me change the colour of 15 minute slots? Thanks.

        protected void RadScheduler1_TimeSlotCreated(object sender, Telerik.Web.UI.TimeSlotCreatedEventArgs e)
        {
            if (dsAppointments == null)
            {
                dsAppointments = GetAppointments(this.RadScheduler1.SelectedDate);
            }
            foreach (DataRow row in dsAppointments.Tables[0].Rows)
            {
                DateTime start = Convert.ToDateTime(row["start"]);
                DateTime end = Convert.ToDateTime(row["end"]);
                if (e.TimeSlot.Resource.Text == Convert.ToString(row["StaffName"]))
                {
                    if ((e.TimeSlot.Start.Date.ToShortDateString() == start.ToShortDateString())
                        && (e.TimeSlot.Start.Hour >= start.Hour && e.TimeSlot.End.Hour <= end.Hour))
                    {
                        e.TimeSlot.CssClass = "Disabled";
                    }
                }
            }
        }

    Read the article

  • Delete a Row from a DataGridView given its index

    - by Ruben Trancoso
    My DataGridView uses single-row selection, and there's a RowEnter event where I get the row index every time the selected row changes:

        private void rowEnter(object sender, DataGridViewCellEventArgs e)
        {
            currentRowIndex = e.RowIndex;
        }

    When I press a delete button, I use the same index to delete the row:

        myDataSet.Avaliado.Rows[currentRowIndex].Delete();
        avaliadoTableAdapter.Update(myDataSet.Avaliado);

    It works fine if no column in the DataGridView is sorted; otherwise I get an error. What is the correct way to find the row index in the dataset that corresponds to the row index from the DataGridView?

    Read the article

  • Parsing a CSV File to a Rails Database

    - by Schroedinger
    G'day guys, I'm using FasterCSV and a rake script to parse a CSV with about 30 columns into my Rails DB for a 'Trade' item. The script works fine when all of the values are treated as strings, but when I change a column to a decimal, int or other type, everything goes to hell. I'm wondering if FasterCSV has built-in parsing for ints and the like, or whether I'll have to manage these conversions within my model. Basically, I'm given a giant amount of trade data, need to import it, and then need to provide feedback with, say, the average trade volume, the times, etc. I understand I can do all that with the wonderful records provided to me by ActiveRecord, but I wondered if there was an easier way to populate a rather large database from a given CSV. Several of the fields don't have values for certain rows; FasterCSV seems to work perfectly when they're all strings, but not when I try to use decimal or other types. (A sketch of the kind of conversion I mean follows below.)

    Read the article
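
    Not FasterCSV, but a minimal sketch (in Python) of the coercion logic in question: blank fields become None instead of raising, and everything else is cast explicitly before it reaches the database. The column names symbol, volume and price are hypothetical:

        import csv
        from decimal import Decimal, InvalidOperation

        def to_decimal(field):
            """Return a Decimal, or None for blank/unparseable fields."""
            field = (field or "").strip()
            if not field:
                return None
            try:
                return Decimal(field)
            except InvalidOperation:
                return None

        def to_int(field):
            field = (field or "").strip()
            return int(field) if field else None

        with open("trades.csv", newline="") as f:
            for row in csv.DictReader(f):
                trade = {
                    "symbol": row.get("symbol"),            # stays a string
                    "volume": to_int(row.get("volume")),    # integer column
                    "price": to_decimal(row.get("price")),  # decimal column
                }
                # trade is now safe to hand to an ORM create() or an INSERT
                print(trade)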

  • Problem with returning values from a helper method in Rails

    - by True Soft
    I want to print some objects in a table having 2 rows per object, like this:

        <tr class="title">
          <td>Name</td><td>Price</td>
        </tr>
        <tr class="content">
          <td>Content</td><td>123</td>
        </tr>

    I wrote a helper method in products_helper.rb, based on the answer to this question:

        def write_products(products)
          products.map { |product|
            content_tag :tr, :class => "title" do
              content_tag :td do
                link_to h(product.name), product, :title => product.name
              end
              content_tag :td do
                product.price
              end
            end
            content_tag :tr, :class => "content" do
              content_tag :td, h(product.content)
              content_tag :td, product.count
            end
          }.join
        end

    But this does not work as expected. It only returns the last node - the last <td>123</td>. What should I do to make it work?

    Read the article

  • Code/Approach Golf: Find row in text file with too many columns

    - by awshepard
    Given a text file that is supposed to contain 10 tab-delimited columns (i.e. 9 tabs), I'd like to find all rows that have more than 10 columns (more than 9 tabs). Each row ends with CR-LF. Assume nothing about the data, field widths, etc., other than the above. Comments regarding the approach and/or working code would be greatly appreciated. Bonus for printing the line numbers of the offending lines as well. Thanks in advance! (One possible sketch follows below.)

    Read the article
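
    One hedged sketch in Python, under the stated assumptions (tab-delimited, CR-LF line endings, and no quoting or escaping of tabs inside fields). It prints the 1-based line number and column count of every offending row:

        import sys

        def report_wide_rows(path, expected_columns=10):
            """Print the line number of every row with more than expected_columns fields."""
            with open(path, newline="") as f:
                for lineno, line in enumerate(f, start=1):
                    fields = line.rstrip("\r\n").split("\t")
                    if len(fields) > expected_columns:
                        print("line %d: %d columns" % (lineno, len(fields)))

        if __name__ == "__main__":
            report_wide_rows(sys.argv[1])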

  • 50 sequences in one line

    - by user343934
    I have a multiple sequence alignment (Clustal) file, and I want to read this file and arrange the sequences in such a way that the output looks clearer and better ordered. I am doing this with Biopython, using the AlignIO object. My code looks like this:

        alignment = AlignIO.read("opuntia.aln", "clustal")
        print "Number of rows: %i" % len(alignment)
        for record in alignment:
            print "%s - %s" % (record.id, record.seq)

    My output (http://i48.tinypic.com/ae48ew.jpg) looks messy and scrolls a long way. What I want to do is print the sequences in blocks of 50 characters per line and continue like that to the end of the alignment file. I wish to have output like this: http://i45.tinypic.com/4vh5rc.jpg, from http://www.ebi.ac.uk/Tools/clustalw2/ (sorry, the two links are just text due to my reputation). Any suggestions, algorithms or sample code are appreciated (a sketch follows below). Thanks in advance. Br,

    Read the article
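
    A minimal sketch of one way to do the wrapping with Biopython (Python 3 syntax), assuming the alignment is read as above; it slices every sequence into 50-character blocks. This is a plain interleaved dump, not a full Clustal writer:

        from Bio import AlignIO

        alignment = AlignIO.read("opuntia.aln", "clustal")
        width = 50                                    # residues per line
        length = alignment.get_alignment_length()

        for start in range(0, length, width):
            for record in alignment:
                # pad the identifier so the sequence blocks line up in columns
                print("%-20s %s" % (record.id[:20], record.seq[start:start + width]))
            print()                                   # blank line between blocks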

  • W3C validation errors

    - by Kyle Sevenoaks
    I have 12 errors, but some of them are for things that simply aren't there. Doctype:

        <!DOCTYPE html PUBLIC "-//W3C//DTD XHTML 1.0 Transitional//EN" "http://www.w3.org/TR/xhtml1/DTD/xhtml1-transitional.dtd">

    Here's the error report, but the "duplicate specification of attribute "value"" simply isn't true according to my .tpl:

        {textfield class="quetext" value="Epost*" onblur="if(this.value=='') this.value='Epost*';" onfocus="if(this.value=='Epost*') this.value='';"}

    Also, does a textarea require the "rows" and "cols" attributes? I thought those were only for tables. And I don't understand what the two errors at the end mean: Line 586, Column 80: Attribute value redefined... Please help! Thanks :) (Sorry if things chop and change, I'm working on the validation now, to tidy up as many errors as possible.)

    Read the article
