Search Results

Search found 7311 results on 293 pages for 'rows'.

Page 219/293 | < Previous Page | 215 216 217 218 219 220 221 222 223 224 225 226  | Next Page >

  • Drupal Views: Render Null Result for Relationship as 0

    - by Kyle S
    I have a View configured in Drupal to return nodes, sorting them by their average vote in descending order. For the purpose of the View, the value of the average votes is a Relationship. I noticed that nodes with no votes are displayed after nodes with a negative average. Nodes with no votes should have an average of 0, but I believe the MySQL JOIN is causing NULL values to be returned (as there are no matching rows in the joined table, since a row is created after the first vote is cast for that item). I discovered that with MySQL it is possible to output all values that are NULL in a column as another value with IFNULL(column_name,'other value'). I feel like I would need to modify the Views module in order to obtain this functionality, but I'm hoping that there is some sort of option that returns NULL values in a relation (a relation doesn't exist for the item) as 0 instead of NULL, so that I can properly sort the nodes. The modules I am using include Views, Voting API, Vote Up/Down, and CTools. Thanks.
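    At the SQL level the usual fix is to let the LEFT JOIN produce its NULLs and then coerce them to 0 before sorting, e.g. with COALESCE (or MySQL's IFNULL). Below is a minimal sketch of that shape of query; the table and column names are illustrative placeholders, not the aliases that Views actually generates, so it shows the idea rather than a drop-in fix.

      -- Sketch only: node/vote table and column names are assumed, not the real Views aliases.
      SELECT n.nid,
             COALESCE(v.average_vote, 0) AS average_vote  -- NULL (no votes yet) sorts as 0
      FROM node n
      LEFT JOIN vote_averages v ON v.nid = n.nid
      ORDER BY average_vote DESC;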

    Read the article

  • Re-use aliased field in SQL SELECT STATEMENT

    - by Bad Display Name
    Hi, the question is hard to define without giving an example, so here is what I'd like to achieve: SELECT (CASE WHEN ...) AS FieldA, 20 + FieldA AS FieldB FROM Tbl Assume that "..." stands for a long and complex CASE expression; I don't want to repeat it when selecting FieldB, and would like to use the alias FieldA instead. Is this even possible? What could be a workaround? Note that this will return multiple rows, so a DECLARE/SET outside the SELECT statement is no good in my case. P.S. I am using SQL Server 2008.
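    One common workaround is to compute the CASE once in a derived table (or a CROSS APPLY) and reference the alias from the outer SELECT. A rough sketch, with a placeholder condition standing in for the long CASE expression:

      -- Sketch: the CASE is written once, inside the derived table, and reused outside it.
      SELECT T.FieldA,
             20 + T.FieldA AS FieldB
      FROM (
          SELECT CASE WHEN SomeColumn > 0 THEN 1 ELSE 0 END AS FieldA  -- placeholder for the real CASE
          FROM Tbl
      ) AS T;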

    Read the article

  • Read [for xml auto, elements] into DataTable in ADO.NET

    - by ihorko
    Hello all! I have an MS SQL Server stored procedure that returns XML (it uses SELECT with FOR XML AUTO, ELEMENTS). I tried to read it into a DataTable: DataTable retTable = new DataTable(); SqlCommand comm = new SqlCommand("exec MySP", connection); SqlDataAdapter da = new SqlDataAdapter(comm); connection.Open(); da.Fill(retTable); but retTable contains 12 rows, with the full XML that SQL Server returns split across them. How can I read that XML from the DB into a DataTable object? Thanks!
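    The usual client-side route for FOR XML output is SqlCommand.ExecuteXmlReader rather than a DataAdapter, which stitches the fragments back together for you. If the procedure itself can be touched, one server-side trick that may help is wrapping the FOR XML query in a subselect so the whole document comes back as a single scalar column instead of being split across rows; a sketch, with MyTable standing in for whatever the procedure actually selects:

      -- Sketch: FOR XML in a subquery comes back as one row with one XML-valued column.
      SELECT (SELECT * FROM MyTable FOR XML AUTO, ELEMENTS) AS XmlResult;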

    Read the article

  • Extendable accessing of sqlite database on android platform

    - by mscriven
    Hi, I am fairly new to the Android SDK and databases and have been searching for an answer to this for quite some time. I am trying to build an app which has multiple tables within a database, e.g. one for weapons, one for armours, etc. However, my DatabaseManager class, which handles all of my table creation, my DatabaseHelper inner class and the populating of data, is turning into an extremely large class requiring high maintenance. Every time I would like to add or remove a table column I need to change quite a few areas of code: every reference to the addition of a row in that table with data, the method that the above calls, the method returning all of the database rows, the code in the helper class creating the table, and any specific update methods. My question is this: surely there must be some better way of coding this system. Maybe using a database isn't the best way to go, or am I just not used to such large classes, having only learned Java at university with my largest class consisting of a mere 400-600 lines of code? Thanks for any help!

    Read the article

  • Error " Index exceeds Matrix dimensions"

    - by Mola
    Hi experts, I am trying to read an Excel 2003 file which consists of 62 columns and 2000 rows, and then draw a 2D dendrogram from the 2000 patterns in 2 categories of data as my plot in MATLAB. When I run the script, it gives me the above error, and I don't know why. Does anybody have any idea why I get the above error? My data is here: http://rapidshare.com/files/383549074/data.xls Please delete the 2001st column if you want to use the data for testing. My code is here: % Script file: cluster_2d_data.m d=2000; n1=22; n2=40; N=62 Data=xlsread('data.xls','A1:BJ2000'); X=Data'; R=1:2000; C=1:2; clustergram(X,'Pdist','euclidean','Linkage','complete','Dimension',2,... 'ROWLABELS',R,'COLUMNLABELS',C,'Dendrogram',{'color',5})

    Read the article

  • RoR: Condition Always False - Why?

    - by Matt Hollingsworth
    Working in RoR 2.3.x. My quiz_results table has a row for user_id (3907) and result (0.1), and two users I'm looking at have no rows in the quiz_results table. This line keeps returning false: -if QuizResult.find_by_user_id(@user_id).present? But if I change it to anything that returns true, the next line reports an error on the * method: ="#{(QuizResult.average('score', :conditions => 'user_id = #{@user.id}') * 100).round}%" The beginning of the code is a loop: [email protected] do |user| Any ideas how to fix this? I have tried unsuccessfully all day.

    Read the article

  • Export XML with only one MySQL request?

    - by mere-teresa
    I want to export some data from 7 tables (a MySQL database) in XML format, and then import it into another database, with an update-or-insert rule for the data. I already have a SQL query retrieving all the data, with JOINs on my 7 tables. But when I try to put the data in XML format, I hit a limit: my PHP loop can catch each row, but I would like to benefit from the hierarchical structure of XML, and all I have are rows with the same data repeated. Is it better to query once and construct the XML tree in PHP, or to query each time I want access to a lower level?
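    Querying once is usually the better option here: if the single JOIN query is ordered by the parent key, one pass over the rows in PHP can build the XML tree without re-querying per level. A sketch with placeholder table names:

      -- Sketch: parent/child are placeholder tables; the ORDER BY keeps each parent's rows
      -- together so one loop can emit the parent element once and nest its children under it.
      SELECT p.id   AS parent_id,
             p.name AS parent_name,
             c.id   AS child_id,
             c.name AS child_name
      FROM parent p
      LEFT JOIN child c ON c.parent_id = p.id
      ORDER BY p.id, c.id;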

    Read the article

  • Is it possible to implement a lightweight database using Blackberry Persistence options?

    - by Tobias
    First I would like to explain why I want to use the alternate BlackBerry persistence options rather than a BlackBerry database itself, say SQLite. The reason is that I want the application I'm designing to be usable on all previous versions of BlackBerry, not just the ones having OS 5.0 or greater. Now, coming back to the actual question: I have a database that I want to replicate for use in the BlackBerry application. The database has 8 tables and each table has approximately 12 columns. One of the tables has 1000 rows. If I were to implement this DB for a BlackBerry application, keeping in mind that it must work on all versions of BlackBerry, what would be the best way to implement it?

    Read the article

  • How to easily get the unmatched condition in mysql

    - by leivli
    I have a "server" table which has a column named 'SN' in mysql, when do query to retrive servers with some sns from 'sn1' to 'sn10000', we can: select * from server where sn in ('sn1','sn2','sn3',...'sn10000'); If there is only one sn in 'sn1'-'sn10000' which not exists in database, then the query above will retrive 9999 rows of result. The question is how can I easily get which one in 'sn1'-'sn10000' is not exists in database except the additional work, such as handling the result with shell script etc. I have an ugly sql like below can use: select * from (select 'sn1' as sn union select 'sn2' union select 'sn3' .... union select 'sn10000') as SN where not exists (select id from server where server.sn=SN.sn); Is Anyone has other better methods? Thanks.

    Read the article

  • Fastest way to move records from an Oracle database into SQL Server

    - by user347748
    Ok this is the scenario... I have a table in Oracle that acts like a queue... A VB.net program reads the queue and calls a stored proc in SQL Server that processes and then inserts the message into another SQL Server table and then deletes the record from the oracle table. We use a DataReader to read the records from Oracle and then call the stored proc for each of the records. The program seems to be a little slow. The stored procedure itself isn't slow. The SP by itself when called in a loop can process about 2000 records in 20 seconds. But when called from the .Net program, the execution time is about 5 records per second. I have seen that most of the time consumed is in calling the stored procedure and waiting for it to return. Is there a better way of doing this? Here is a snippet of the actual code Function StartDataXfer() As Boolean Dim status As Boolean = False Try SqlConn.Open() OraConn.Open() c.ErrorLog(Now.ToString & "--Going to Get the messages from oracle", 1) If GetMsgsFromOracle() Then c.ErrorLog(Now.ToString & "--Got messages from oracle", 1) If ProcessMessages() Then c.ErrorLog(Now.ToString & "--Finished Processing all messages in the queue", 0) status = True Else c.ErrorLog(Now.ToString & "--Failed to Process all messages in the queue", 0) status = False End If Else status = True End If StartDataXfer = status Catch ex As Exception Finally SqlConn.Close() OraConn.Close() End Try End Function Private Function GetMsgsFromOracle() As Boolean Try OraDataAdapter = New OleDb.OleDbDataAdapter OraDataTable = New System.Data.DataTable OraSelCmd = New OleDb.OleDbCommand GetMsgsFromOracle = False With OraSelCmd .CommandType = CommandType.Text .Connection = OraConn .CommandText = GetMsgSql End With OraDataAdapter.SelectCommand = OraSelCmd OraDataAdapter.Fill(OraDataTable) If OraDataTable.Rows.Count > 0 Then GetMsgsFromOracle = True End If Catch ex As Exception GetMsgsFromOracle = False End Try End Function Private Function ProcessMessages() As Boolean Try ProcessMessages = False PrepareSQLInsert() PrepOraDel() i = 0 Dim Method As Integer Dim OraDataRow As DataRow c.ErrorLog(Now.ToString & "--Going to call message sending procedure", 2) For Each OraDataRow In OraDataTable.Rows With OraDataRow Method = GetMethod(.Item(0)) SQLInsCmd.Parameters("RelLifeTime").Value = c.RelLifetime SQLInsCmd.Parameters("Param1").Value = Nothing SQLInsCmd.Parameters("ID").Value = GenerateTransactionID() ' Nothing SQLInsCmd.Parameters("UID").Value = Nothing SQLInsCmd.Parameters("Param").Value = Nothing SQLInsCmd.Parameters("Credit").Value = 0 SQLInsCmd.ExecuteNonQuery() 'check the return value If SQLInsCmd.Parameters("ReturnValue").Value = 1 And SQLInsCmd.Parameters("OutPutParam").Value = 0 Then 'success 'delete the input record from the source table once it is logged c.ErrorLog(Now.ToString & "--Moved record successfully", 2) OraDataAdapter.DeleteCommand.Parameters("P(0)").Value = OraDataRow.Item(6) OraDataAdapter.DeleteCommand.ExecuteNonQuery() c.ErrorLog(Now.ToString & "--Deleted record successfully", 2) OraDataAdapter.Update(OraDataTable) c.ErrorLog(Now.ToString & "--Committed record successfully", 2) i = i + 1 Else 'failure c.ErrorLog(Now.ToString & "--Failed to exec: " & c.DestIns & "Status: " & SQLInsCmd.Parameters("OutPutParam").Value & " and TrackId: " & SQLInsCmd.Parameters("TrackID").Value.ToString, 0) End If If File.Exists("stop.txt") Then c.ErrorLog(Now.ToString & "--Stop File Found", 1) 'ProcessMessages = True 'Exit Function Exit For End If End With Next OraDataAdapter.Update(OraDataTable) 
c.ErrorLog(Now.ToString & "--Updated Oracle Table", 1) c.ErrorLog(Now.ToString & "--Moved " & i & " records from Oracle to SQL Table", 1) ProcessMessages = True Catch ex As Exception ProcessMessages = False c.ErrorLog(Now.ToString & "--MoveMsgsToSQL: " & ex.Message, 0) Finally OraDataTable.Clear() OraDataTable.Dispose() OraDataAdapter.Dispose() OraDelCmd.Dispose() OraDelCmd = Nothing OraSelCmd = Nothing OraDataTable = Nothing OraDataAdapter = Nothing End Try End Function Public Function GenerateTransactionID() As Int64 Dim SeqNo As Int64 Dim qry As String Dim SqlTransCmd As New OleDb.OleDbCommand qry = " select seqno from StoreSeqNo" SqlTransCmd.CommandType = CommandType.Text SqlTransCmd.Connection = SqlConn SqlTransCmd.CommandText = qry SeqNo = SqlTransCmd.ExecuteScalar If SeqNo > 2147483647 Then qry = "update StoreSeqNo set seqno=1" SqlTransCmd.CommandText = qry SqlTransCmd.ExecuteNonQuery() GenerateTransactionID = 1 Else qry = "update StoreSeqNo set seqno=" & SeqNo + 1 SqlTransCmd.CommandText = qry SqlTransCmd.ExecuteNonQuery() GenerateTransactionID = SeqNo End If End Function Private Function PrepareSQLInsert() As Boolean 'function to prepare the insert statement for the insert into the SQL stmt using 'the sql procedure SMSProcessAndDispatch Try Dim dr As DataRow SQLInsCmd = New OleDb.OleDbCommand With SQLInsCmd .CommandType = CommandType.StoredProcedure .Connection = SqlConn .CommandText = SQLInsProc .Parameters.Add("ReturnValue", OleDb.OleDbType.Integer) .Parameters("ReturnValue").Direction = ParameterDirection.ReturnValue .Parameters.Add("OutPutParam", OleDb.OleDbType.Integer) .Parameters("OutPutParam").Direction = ParameterDirection.Output .Parameters.Add("TrackID", OleDb.OleDbType.VarChar, 70) .Parameters.Add("RelLifeTime", OleDb.OleDbType.TinyInt) .Parameters("RelLifeTime").Direction = ParameterDirection.Input .Parameters.Add("Param1", OleDb.OleDbType.VarChar, 160) .Parameters("Param1").Direction = ParameterDirection.Input .Parameters.Add("TransID", OleDb.OleDbType.VarChar, 70) .Parameters("TransID").Direction = ParameterDirection.Input .Parameters.Add("UID", OleDb.OleDbType.VarChar, 20) .Parameters("UID").Direction = ParameterDirection.Input .Parameters.Add("Param", OleDb.OleDbType.VarChar, 160) .Parameters("Param").Direction = ParameterDirection.Input .Parameters.Add("CheckCredit", OleDb.OleDbType.Integer) .Parameters("CheckCredit").Direction = ParameterDirection.Input .Prepare() End With Catch ex As Exception c.ErrorLog(Now.ToString & "--PrepareSQLInsert: " & ex.Message) End Try End Function Private Function PrepOraDel() As Boolean OraDelCmd = New OleDb.OleDbCommand Try PrepOraDel = False With OraDelCmd .CommandType = CommandType.Text .Connection = OraConn .CommandText = DelSrcSQL .Parameters.Add("P(0)", OleDb.OleDbType.VarChar, 160) 'RowID .Parameters("P(0)").Direction = ParameterDirection.Input .Prepare() End With OraDataAdapter.DeleteCommand = OraDelCmd PrepOraDel = True Catch ex As Exception PrepOraDel = False End Try End Function WHat i would like to know is, if there is anyway to speed up this program? Any ideas/suggestions would be highly appreciated... Regardss, Chetan

    Read the article

  • Merge two excel files (with the condition)

    - by chennai
    I have a form in Access with two text boxes that accept two Excel files, plus a button. When I click the Generate button, an output Excel file has to be created based on the following conditions. In one Excel file I have this data:

    id    code   country  count
    t100  gb123  india    3123
    t100  gh125  UK       1258
    t123  ytr15  USA      1111
    t123  gb123  Germany  100
    t145  gh575  india    99
    t458  yt777  USA      90

    In the other Excel file I have this data:

    country  location
    India    delhi
    UK       london
    USA      wallstreet
    Germany  frankfurt

    There can be more rows than I have mentioned here. I want to merge them according to the country: in the Book1 Excel file, wherever the country is india, the location delhi has to be inserted right beside the country field, and the same has to be done for every country mentioned in the Book2 Excel file. Finally the output file has to be sorted according to the count. For example the output file should look like this:

    id    code   country  count  Location
    t100  gb123  india    3123   delhi
    t100  gh125  UK       1258   london
    t123  ytr15  USA      1111   wallstreet
    t123  gb123  Germany  100    frankfurt
    t145  gh575  india    99     delhi
    t458  yt777  USA      90     wallstreet
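    If the two workbooks are imported into (or linked from) Access as tables, the merge itself is a plain join followed by a sort; a sketch in Access SQL, with Book1 and Book2 as assumed table names for the two sheets:

      -- Sketch (Access/Jet SQL): Book1 and Book2 are assumed names for the imported sheets.
      SELECT b1.id, b1.code, b1.country, b1.[count], b2.location
      FROM Book1 AS b1
      INNER JOIN Book2 AS b2 ON b1.country = b2.country
      ORDER BY b1.[count] DESC;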

    Read the article

  • optimizing oracle query

    - by deming
    I'm having a hard time wrapping my head around this query. It is taking almost 200+ seconds to execute. I've pasted the execution plan as well.

    SELECT user_id, ROLE_ID, effective_from_date, effective_to_date, participant_code, ACTIVE
    FROM CMP_USER_ROLE E
    WHERE ACTIVE = 0
      AND (SYSDATE BETWEEN effective_from_date AND effective_to_date
           OR TO_CHAR(effective_to_date,'YYYY-Q') = '2010-2')
      AND participant_code = 'NY005'
      AND NOT EXISTS (
            SELECT 1
            FROM CMP_USER_ROLE r
            WHERE r.USER_ID = E.USER_ID
              AND r.role_id = E.role_id
              AND r.ACTIVE = 4
              AND E.effective_to_date <= (SELECT MAX(last_update_date)
                                          FROM CMP_USER_ROLE S
                                          WHERE S.role_id = r.role_id
                                            AND S.role_id = r.role_id
                                            AND S.ACTIVE = 4))

    Explain plan:

    | Id | Operation                     | Name             | Rows | Bytes | Cost (%CPU) | Time     |
    |  0 | SELECT STATEMENT              |                  |    1 |    37 |   154   (2) | 00:00:02 |
    |* 1 | FILTER                        |                  |      |       |             |          |
    |* 2 | TABLE ACCESS BY INDEX ROWID   | USER_ROLE        |    1 |    37 |    30   (0) | 00:00:01 |
    |* 3 | INDEX RANGE SCAN              | N_USER_ROLE_IDX6 |   27 |       |     3   (0) | 00:00:01 |
    |* 4 | FILTER                        |                  |      |       |             |          |
    |  5 | HASH GROUP BY                 |                  |    1 |    47 |   124   (2) | 00:00:02 |
    |* 6 | TABLE ACCESS BY INDEX ROWID   | USER_ROLE        |  159 |  3339 |   119   (1) | 00:00:02 |
    |  7 | NESTED LOOPS                  |                  |   11 |   517 |   123   (1) | 00:00:02 |
    |* 8 | TABLE ACCESS BY INDEX ROWID   | USER_ROLE        |    1 |    26 |     4   (0) | 00:00:01 |
    |* 9 | INDEX RANGE SCAN              | N_USER_ROLE_IDX5 |    1 |       |     3   (0) | 00:00:01 |
    |*10 | INDEX RANGE SCAN              | N_USER_ROLE_IDX2 |  957 |       |    74   (2) | 00:00:01 |
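    One thing worth checking before touching indexes is the TO_CHAR predicate: formatting effective_to_date defeats any index on that column, whereas the same quarter test written as a date range is sargable. A sketch of just that rewrite ('2010-2' corresponds to April through June 2010):

      -- Sketch: replace the non-sargable TO_CHAR(...) = '2010-2' test with an equivalent date range.
      AND ( SYSDATE BETWEEN effective_from_date AND effective_to_date
            OR ( effective_to_date >= DATE '2010-04-01'
                 AND effective_to_date <  DATE '2010-07-01' ) )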

    Read the article

  • [SQL] Query returning more than one row with the same name

    - by Neutralise
    I am having trouble with an SQL query returning more than one row with the same name, using this query: SELECT * FROM People P JOIN SpecialityCombo C ON P.PERSONID = C.PERSONID JOIN Speciality S ON C.GROUPID = S.ID; People contains information on each person, Speciality contains the names and IDs of each speciality, and SpecialityCombo holds the associations between People and their Specialities: each row has a PERSONID and a Speciality ID (trying to keep it normalised to some extent). My query works in that it returns each Person and the name of their speciality, but it returns n rows for a person with n specialities, each row repeating the same person data with a different speciality name. What I want is for it to return just one row containing every speciality. How can I do this?
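    If the database supports string aggregation, the specialities can be collapsed into a single column per person. A sketch in MySQL syntax using GROUP_CONCAT (S.Name is an assumed name for the speciality's text column); SQL Server and PostgreSQL have FOR XML PATH and string_agg equivalents:

      -- Sketch (MySQL): one row per person, all speciality names joined into one column.
      SELECT P.PERSONID,
             GROUP_CONCAT(S.Name ORDER BY S.Name SEPARATOR ', ') AS specialities
      FROM People P
      JOIN SpecialityCombo C ON P.PERSONID = C.PERSONID
      JOIN Speciality S ON C.GROUPID = S.ID
      GROUP BY P.PERSONID;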

    Read the article

  • In Selenium IDE, can I store an element to use in subsequent asserts?

    - by Colin Fine
    I've just started using Selenium IDE (I haven't looked at Selenium RC yet; if somebody tells me that that is the answer to my question I'll look at it). One of the operations I'm testing generates some output in a table on the next HTML page, but the order of the rows is not predictable. I can obviously use 'assertTextPresent', but I want to do a bit more and check that various bits of text are in the same row. What I would like to be able to do is identify a tr by some content, and then use that tr in subsequent asserts; something like:
    storeExpression //table[@id='TABLE_6']/td[.='case_1']/.. row
    assertText ${row} 'Some text'
    assertText ${row} 'Some other text'
    to check that 'Some text' and 'Some other text' occur in the same table row as 'case_1'. I haven't got this to work so far, and I'm not sure whether it is possible, or what syntax to use if it is. Has anybody managed to do this?

    Read the article

  • How to convert a DataTable to a List at runtime without an existing class property [closed]

    - by shamim
    I work in VS2010 with C#. I have one DataTable and want to convert this DataTable to a List. Suppose: Table dt; At runtime I want to create fields matching the DataTable's columns and fill them into a List. There is no existing class for the list properties: list name = table name, list property name = table column name, list property type = table column type, list items = table rows. Note: I recently work with EF. To fulfil my project requirement I need to give the user the flexibility to input and execute ESQL at runtime. I don't want to put the execution result into a DataTable or a List of type DataRow; I want to put the result into a list whose element type has no existing class or properties. If you have any questions please ask. Thanks in advance.

    Read the article

  • Data manipulation without server side

    - by monczek
    Hi, I have to create a very simple webpage to show, filter and add data from a not-yet-defined source (probably txt/xml/csv). Records should be visible as a table, filtered using 3 criteria fields. There should also be a possibility to add new records. My first thought was: XHTML + jQuery + csv2table + PicNet Table Filter. It does exactly what I want except adding rows, that is, saving changes in the source file (probably due to the security risk). My question is: is there any possibility to do this without involving a server side like ASP.NET, JEE, PHP or SQL? The source file is located on the server. Thanks for your ideas :-)

    Read the article

  • WPF Binding to a viewmodel

    - by user832747
    I'm simply binding a WPF DataGridTextColumn with a binding to my grid rows. <DataGridTextColumn Header="Name" Binding="{Binding Name}" /> I've bound to my row view models. The Name property has a PRIVATE setter. public string Name { get { return _name; } private set { _name = value; } } Shouldn't the datagrid prevent me from accessing the private setter? The grid allows me to access it. I swear it never used to, unless I'm forgetting something?

    Read the article

  • Controlling Too Many ListBoxes (C#)

    - by Ases
    I have almost 200 ListBoxes. I'm changing their visibility according to my variables from the database. So I prepared an array, like this: ListBox[] lbs = this.Controls.OfType<ListBox>().ToArray(); And used it like this: for (int idx = 0; idx < Convert.ToInt32(ds.Tables[j].Rows[i][2])*12; idx++) lbs[idx].Visible = true; This code runs in the ComboBox change event. The first time I change the ComboBox, ListBoxes 1-20 become visible as expected, but when I change the ComboBox again it is no longer 1-20 that change; 20-40 change instead. How can this be, and can you suggest an alternative array or list type, or another way to do this?

    Read the article

  • MySQL Insert Query Randomly Takes a Long Time

    - by ShimmerTroll
    I am using MySQL to manage session data for my PHP application. When testing the app, it is usually very quick and responsive. However, seemingly randomly the response will stall before finally completing after a few seconds. I have narrowed the problem down to the session write query which looks something like this: INSERT INTO Session VALUES('lvg0p9peb1vd55tue9nvh460a7', '1275704013', '') ON DUPLICATE KEY UPDATE sessAccess='1275704013',sessData=''; The slow query log has this information: Query_time: 0.524446 Lock_time: 0.000046 Rows_sent: 0 Rows_examined: 0 This happens about 1 out of every 10 times. The query usually only takes ~0.0044 sec. The table is InnoDB with about 60 rows. sessId is the primary key with a BTREE index. Since this is accessed on every page view, it is clearly not an acceptable execution time. Why is this happening? Update: Table schema is: sessId:varchar(32), sessAccess:int(10), sessData:text

    Read the article

  • Strange postgresql behavior

    - by Sergey
    Hi, can someone explain to me why it works like this? = select client_id from clients_to_delete; ERROR: column "client_id" does not exist at character 8 But when I put this inside an IN()... = select * from orders where client_id in(select client_id from clients_to_delete); it works, and selects all rows in the orders table. The same happens when running DELETE/UPDATE. Why doesn't it produce an error like before? Thank you!
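    The unqualified client_id inside the IN subquery is resolved as a correlated reference to orders.client_id, which is why every row matches instead of raising an error. Qualifying the column with the subquery's own table turns the typo back into an error; a sketch:

      -- Sketch: with the column qualified, PostgreSQL reports the missing column
      -- instead of silently correlating it with orders.client_id.
      SELECT *
      FROM orders
      WHERE client_id IN (SELECT ctd.client_id FROM clients_to_delete AS ctd);
      -- ERROR: column ctd.client_id does not exist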

    Read the article

  • Database table schema design - varchar(n). Suitable choice of N

    - by morpheous
    Coming from a C background, I may be getting too anal about this and worrying unnecessarily about bits and bytes here. Still, I can't help thinking about how the data is actually stored, and that if I choose an N which is easily factorizable into a power of 2, the database will be more efficient in how it packs data. Using this "logic", I have a string field in a table which is variable length, up to 21 chars. I am tempted to use 32 instead of 21 for the reason given above; however, now I am thinking that I am wasting disk space because there will be space allocated for 11 extra chars that are guaranteed never to be used. Since I envisage storing several tens of thousands of rows a day, it all adds up. Question: mindful of all of the above, should I declare varchar(21) or varchar(32), and why?

    Read the article

  • Any idea why this query always returns duplicate items?

    - by Kardo
    I want to get all Images not used by the current ItemID. I tried this subquery, but it always returns duplicate Images: EDITED select Images.ImageID, Images.ItemStatus, Images.UserName, Images.Url, Image_Item.ItemID, Image_Item.ItemID from Images left join (select ImageID, ItemID, MAX(DateCreated) x from Image_Item where ItemID != '5a0077fe-cf86-434d-9f3b-7ff3030a1b6e' group by ImageID, ItemID having count(*) = 1) image_item on Images.imageid = image_item.imageid where ItemID is not null I guess the problem is with the subquery, in which I can't avoid duplicate rows: select ImageID, ItemID, MAX(DateCreated) x from Image_Item where ItemID != '5a0077fe-cf86-434d-9f3b-7ff3030a1b6e' group by ImageID, ItemID having count(*) = 1 Result: F2EECBDC-963D-42A7-90B1-4F82F89A64C7 0578AC61-3C32-4A1D-812C-60A09A661E71 F2EECBDC-963D-42A7-90B1-4F82F89A64C7 9A4EC913-5AD6-4F9E-AF6D-CF4455D81C10 42BC8B1A-7430-4915-9CDA-C907CBC76D6A CB298EB9-A105-4797-985E-A370013B684F 16371C34-B861-477C-9A7C-DEB27C8F333D 44E6349B-7EBF-4C7E-B3B0-1C6E2F19992C Table Images: ImageID uniqueidentifier, UserName nvarchar(100), DateCreated smalldatetime, Url nvarchar(250), ItemStatus char(1). Table Image_Item: ImageID uniqueidentifier, ItemID uniqueidentifier, UserName nvarchar(100), ItemStatus char(1), DateCreated smalldatetime. Any kind of help is highly appreciated.
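    If the goal is simply "all Images that have no Image_Item row for this ItemID", an anti-join avoids the GROUP BY/HAVING juggling entirely and returns each image once; a sketch under that reading of the requirement:

      -- Sketch: each image appears once; it is kept only if no Image_Item row links it to the given ItemID.
      SELECT i.ImageID, i.ItemStatus, i.UserName, i.Url
      FROM Images AS i
      WHERE NOT EXISTS (
          SELECT 1
          FROM Image_Item AS ii
          WHERE ii.ImageID = i.ImageID
            AND ii.ItemID  = '5a0077fe-cf86-434d-9f3b-7ff3030a1b6e'
      );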

    Read the article

  • linq2sql left join with "multiselect"

    - by just_azho
    Hi folks, I'm trying to achieve the following with linq2sql, but without success. I have Member and Reference tables. The DB is designed in such a manner that a Member can have multiple (>= 0) References. What I want as the result of the query is a list (rows) of members, where all references of each member are "collected" into one column. What I have achieved is the following query, but with it there is a row for each Reference. var refs = (from m in db.Members join r in db.References on m.PID equals r.PID into g from o in g.DefaultIfEmpty() select new { member = m, name = (o == null ? "" : o.NameSurname) }); I feel I need to insert a SelectMany somewhere :) Could you please give hints on achieving the goal?

    Read the article

  • Do partitions allow multiple bulk loads?

    - by ck
    I have a database that contains data for many "clients". Currently, we insert tens of thousands of rows into multiple tables every so often using .Net SqlBulkCopy which causes the entire tables to be locked and inaccessible for the duration of the transaction. As most of our business processes rely upon accessing data for only one client at a time, we would like to be able to load data for one client, while updating data for another client. To make things more fun, all PKs, FKs and clustered indexes are on GUID columns (I am looking at changing this). I'm looking at adding the ClientID into all tables, then partitioning on this. Would this give me the functionality I require?

    Read the article

  • How to count the number of occurrences of all different values in a database column?

    - by drasto
    I have a Postgres database that has, say, 10 columns. The fifth column is called column5. There are 100 rows in the table and the possible values of column5 are c5value1, c5value2, c5value3 ... c5value29, c5value30. I would like to print out a table that shows how many times each value occurs, so it would look like this:

    Value (of column5)   Number of occurrences of the value
    c5value1             1
    c5value2             5
    c5value3             3
    c5value4             9
    c5value5             1
    c5value6             1
    ...

    What is the command that does that? Thanks for the help.
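    This is a single GROUP BY; a sketch, with the_table standing in for the real table name:

      -- Sketch: the_table is a placeholder for the actual table name.
      SELECT column5,
             COUNT(*) AS occurrences
      FROM the_table
      GROUP BY column5
      ORDER BY column5;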

    Read the article
