Search Results

Search found 33316 results on 1333 pages for 'sql team'.


  • Free public databases with non-trivial table structures?

    - by Caffeine Coma
    I'm looking for some sample database data that I can use for testing and demonstrating a DB tool I am working on. I need a DB that has (preferably) many tables, and many foreign key relationships between the tables. Ideally the data would be in SQL dump format, or at least in something that maintains the foreign key references, and could be easily imported into an RDBMS (MySQL or H2). The dataset itself doesn't have to be huge (in fact, best if it's not). I thought about using the Stackoverflow Data Dump, but it's only about 5 tables.

    Read the article

  • Rails problem with find_by_sql

    - by Totty
    I have this query and I have an error:

        images = Image.find_by_sql('PREPARE stmt FROM \' SELECT * FROM images AS i WHERE i.on_id = 1 AND i.on_type = "profile" ORDER BY i.updated_at LIMIT ?, 6\'; SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit;')

    The error is:

        Mysql::Error: You have an error in your SQL syntax; check the manual that corresponds to your MySQL server version for the right syntax to use near 'SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit' at line 1:
        PREPARE stmt FROM ' SELECT * FROM images AS i WHERE i.on_id = 1 AND i.on_type = "profile" ORDER BY i.updated_at LIMIT ?, 6'; SET @lower_limit := ((5 DIV 6) * 6); EXECUTE stmt USING @lower_limit;
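
    The error suggests the driver is rejecting the multi-statement string (PREPARE; SET; EXECUTE sent as one query). A minimal sketch of the equivalent single statement, assuming the offset ((5 DIV 6) * 6, i.e. 0 here) is computed in application code before the query string is built:

        SELECT *
        FROM images AS i
        WHERE i.on_id = 1
          AND i.on_type = 'profile'
        ORDER BY i.updated_at
        LIMIT 0, 6;   -- 0 = (5 DIV 6) * 6, computed before building the query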

    Read the article

  • Retrieve names by ratio of their occurrence

    - by jjiffer
    Hello, I'm somewhat new to SQL queries, and I'm struggling with this particular problem. Let's say I have a query that returns the following 3 records (kept to one column for simplicity):

        Tom
        Jack
        Tom

    I want those results grouped by the name, and I also want the fraction (ratio) of occurrences of that name out of the total records returned. So the desired result would be (as two columns):

        Tom  | 2/3
        Jack | 1/3

    How would I go about it? Determining the numerator is pretty easy (I can just use COUNT() and GROUP BY name), but I'm having trouble translating that into a ratio out of the total rows returned. Any help is much appreciated!
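
    A minimal sketch of one way to do this, assuming a table called people with a name column; the total comes from an uncorrelated subquery, and multiplying by 1.0 avoids integer division:

        SELECT name,
               COUNT(*) AS occurrences,
               COUNT(*) * 1.0 / (SELECT COUNT(*) FROM people) AS ratio
        FROM people
        GROUP BY name;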

    Read the article

  • Management Studio default file save location

    - by jayrdub
    Open a new query window and write some SQL. When I save the script, the Save File As dialog box opens - but always to the same default location in the Profiles directory. Is there any way to set my default file location? ...Like I used to do with apps from the 1980s? Under Tools | Options a default location can be specified for query results; I need the same thing for new queries (the text editor). I tried changing locations in the Registry, but SSMS just overwrote my changes. Any suggestions? (I saw this unanswered question at http://www.eggheadcafe.com/software/aspnet/30098335/management-studio-default.aspx and I had the same exact question, so I reposted it here.)

    Read the article

  • Why isn't DBIx::Class::Schema::Loader creating my classes?

    - by Robert Wohlfarth
    I am trying to generate static schemas using DBIx::Class in Perl. The command shown below outputs a Schema.pm and no other files. Any idea what I'm doing wrong, or how to debug this?

        U:\wohlfarj\Software\PARS>perl -MDBIx::Class::Schema::Loader=make_schema_at,dump_to_dir:.\lib -e "make_schema_at('PARS::Schema',{debug=>1},['dbi:ODBC:PARS','user','password',{AutoCommit=>0}])"
        Dumping manual schema for PARS::Schema to directory .\lib ...
        Schema dump completed.

    I'm using Strawberry Perl on Windows XP. The database is SQL Server 2000, accessed through an ODBC connection. I can successfully run queries using plain old DBI with the same ODBC connection.

    Read the article

  • Concurrency issues with a scheduling app

    - by Sazug
    Our application needs a simple scheduling mechanism - we can schedule only one visit per room for the same time interval (but one visit can use one or more rooms). Using SQL Server 2005, a sample procedure could look like this:

        CREATE PROCEDURE CreateVisit
            @start datetime,
            @end datetime,
            @roomID int
        AS
        BEGIN
            DECLARE @isFreeRoom INT

            BEGIN TRANSACTION

            SELECT @isFreeRoom = COUNT(*)
            FROM visits V
            INNER JOIN visits_rooms VR ON VR.VisitID = V.ID
            WHERE @start = start AND @end = [end] AND VR.RoomID = @roomID

            IF (@isFreeRoom = 0)
            BEGIN
                INSERT INTO visits (start, [end]) VALUES (@start, @end)
                INSERT INTO visits_rooms (visitID, roomID) VALUES (SCOPE_IDENTITY(), @roomID)
            END

            COMMIT TRANSACTION
        END

    To avoid having the same room scheduled for two visits at the same time, how should we handle this in the procedure? Should we use the SERIALIZABLE transaction isolation level, or maybe table hints (locks)? Which one is better?
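
    As a point of comparison for the two options, a minimal sketch of the table-hint approach (same schema as above): UPDLOCK and HOLDLOCK make the existence check take locks and hold them until the transaction ends, so two concurrent calls cannot both see the room as free and then both insert.

        SELECT @isFreeRoom = COUNT(*)
        FROM visits V WITH (UPDLOCK, HOLDLOCK)
        INNER JOIN visits_rooms VR WITH (UPDLOCK, HOLDLOCK) ON VR.VisitID = V.ID
        WHERE @start = V.start AND @end = V.[end] AND VR.RoomID = @roomID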

    Read the article

  • How to read a txt file from the database (byte[] to FileStream)

    - by Ranjana
    I have stored a txt file in a SQL Server database, and I need to read it line by line to get its content. My code:

        DataTable dtDeleteFolderFile = new DataTable();
        dtDeleteFolderFile = objutility.GetData("GetTxtFileonFileName",
            new object[] { ddlSelectFile.SelectedItem.Text }).Tables[0];
        foreach (DataRow dr in dtDeleteFolderFile.Rows)
        {
            name = dr["FileName"].ToString();
            records = Convert.ToInt32(dr["NoOfRecords"].ToString());
            bytes = (Byte[])dr["Data"];
        }

        FileStream readfile = new FileStream(Server.MapPath("txtfiles/" + name), FileMode.Open);
        StreamReader streamreader = new StreamReader(readfile);
        string line = "";
        line = streamreader.ReadLine();

    Here I use a FileStream to read from a particular path, but I have saved the txt file in byte format in my database. How can I read the txt file content using the byte[] value instead of the path?

    Read the article

  • Troubleshooting MSSQL Connection from PHP

    - by Cory Dee
    I'm trying to connect to an external SQL Server through PHP 5.2, using this line:

        $con = mssql_connect('123.123.123.123','Username','Password') or die('Could not connect to the server!');

    I'm receiving this error:

        Warning: mssql_connect() [function.mssql-connect]: Unable to connect to server: 123.123.123.123 in /home/file/public_html/structure/index.php on line 4
        Could not connect to the server!

    My hosting provider assures me that the ports are open for my server to connect to the DB. Looking at my phpinfo, MSSQL support is enabled, using FreeTDS. Any ideas why this would be failing, or how I can begin troubleshooting the problem?

    Read the article

  • Speeding up inner joins between a large table and a small table

    - by Zaid
    This may be a silly question, but it may shed some light on how joins work internally. Let's say I have a large table L and a small table S (100K rows vs. 100 rows). Would there be any difference in terms of speed between the following two options?

        -- Option 1
        SELECT *
        FROM L INNER JOIN S
        ON L.id = S.id;

        -- Option 2
        SELECT *
        FROM S INNER JOIN L
        ON L.id = S.id;

    Notice that the only difference is the order in which the tables are joined. I realize performance may vary between different SQL implementations. If so, how would MySQL compare to Access?
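
    One way to see whether the written order matters on MySQL is to pin the join order down explicitly and compare the plans; a minimal sketch using STRAIGHT_JOIN, which forces the left-hand table to be read first:

        EXPLAIN SELECT * FROM L STRAIGHT_JOIN S ON L.id = S.id;
        EXPLAIN SELECT * FROM S STRAIGHT_JOIN L ON L.id = S.id;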

    Read the article

  • Union on ValuesQuerySet in django

    - by Wuxab
    I've been searching for a way to take the union of querysets in Django. From what I've read, you can use query1 | query2 to take the union... but this doesn't seem to work when using values(). I'd skip using values() until after taking the union, but I need to use annotate() to take the sum of a field and filter on it, and since there's no way to do "group by" I have to use values(). The other suggestion I read was to use Q objects, but I can't think of a way that would work. Do I pretty much need to use straight SQL, or is there a Django way of doing this? What I want is:

        q1 = mymodel.objects.filter(date__lt = '2010-06-11').values('field1','field2').annotate(volsum=Sum('volume')).exclude(volsum=0)
        q2 = mymodel.objects.values('field1','field2').annotate(volsum=Sum('volume')).exclude(volsum=0)
        query = q1 | q2

    But this doesn't work, and as far as I know I need the "values" part because there's no other way for Sum to know how to act, since it's a 15-column table.
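
    For reference, a minimal sketch of the raw SQL the two querysets boil down to, combined with UNION (table and column names are assumed from the querysets; Django would name the table something like appname_mymodel):

        SELECT field1, field2, SUM(volume) AS volsum
        FROM appname_mymodel
        WHERE date < '2010-06-11'
        GROUP BY field1, field2
        HAVING SUM(volume) <> 0
        UNION
        SELECT field1, field2, SUM(volume) AS volsum
        FROM appname_mymodel
        GROUP BY field1, field2
        HAVING SUM(volume) <> 0;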

    Read the article

  • How to order the items in a nested LINQ-provided collection

    - by Carson McComas
    I've got a (SQL Server) database table called Category, and another database table called SubCategory. SubCategory has a foreign key relationship to Category. Because of this, thanks to LINQ, each Category has a property called SubCategories, and LINQ is nice enough to return all the SubCategories associated with my Category when I grab it. If I want to sort the Categories alphabetically, I can just do:

        return db.Categories.OrderBy(c => c.Name);

    However, I have no idea how to order the SubCategories collection inside each Category. My goal is to return a collection of Categories where all of the SubCategory collections inside of them are ordered alphabetically by Name.

    Read the article

  • Losing DateTimeOffset precision when using C#

    - by Darvis Lombardo
    I have a SQL Server table with a CreatedDate field of type DateTimeOffset(2). A sample value which is in the table is 2010-03-01 15:18:58.57 -05:00. As an example, from within C# I retrieve this value like so:

        var cmd = new SqlCommand("SELECT CreatedDate FROM Entities WHERE EntityID = 2", cn);
        var da = new SqlDataAdapter(cmd);
        DataTable dt = new DataTable();
        da.Fill(dt);

    And I look at the value:

        MessageBox.Show(dt.Rows[0][0].ToString());

    The result is 2010-03-01 15:18:58 -05:00, which is missing the .57 that is stored in the database. If I look at dt.Rows[0][0] in the Watch window, I also do not see the .57, so it appears it has been truncated. Can someone shed some light on this? I need to use the date to match up with other records in the database and the .57 is needed. Thanks! Darvis
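
    If the fractional seconds have to survive regardless of how the client materializes the value, one workaround is to format the value on the server and return it as text; a minimal sketch (assuming the Entities table from the question; CONVERT style 121 keeps the fractional seconds and the offset):

        SELECT CONVERT(varchar(34), CreatedDate, 121) AS CreatedDateText
        FROM Entities
        WHERE EntityID = 2;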

    Read the article

  • Best practices for extending third party databases?

    - by Eric Watkins
    I have a situation where our developers extended a third-party database (MS SQL) by adding tables, views, stored procedures, and functions. Recently, when the vendor issued updates to the database, they dropped all of our custom objects. The question now is: what are some best practices that will allow us to extend the third-party database but keep our objects safe from future updates? My first thought is to create a separate database, but then I'm stuck with fully qualifying all the references back to the original database, which may cause issues promoting database changes from test to production.
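
    One middle ground between living in the vendor's schema and a separate database is a dedicated schema inside the vendor database for the custom objects, so they are easy to script out and re-create after a vendor upgrade. A minimal sketch (the object names here are made up for illustration):

        CREATE SCHEMA custom AUTHORIZATION dbo;
        GO
        CREATE VIEW custom.vOrderSummary
        AS
        SELECT o.OrderID, o.OrderDate
        FROM dbo.Orders AS o;   -- dbo.Orders stands in for a vendor table
        GO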

    Read the article

  • How to convert an MSSQL database (including procedures, functions and triggers) to a Firebird database?

    - by user193655
    I am considering migrating to Firebird. To get a "quick start" I downloaded the trial of a conversion tool (DBConvert) and tried it. I just picked a random tool; this tool doesn't convert procedures, functions and triggers (I don't think it is a limitation of the trial, since there is no explicit reference to stored procedures, functions and triggers in the link above). Anyway, when trying that tool I got the message: "The DB cannot be converted successfully because some FK names are too long." This is because in some tables I have FKs whose names are 32 chars. Is this a real Firebird limit, or is it possible to overcome it somehow (of course renaming the FKs is an extreme option because it is extra work)? Anyway, how can I fully convert an MS SQL DB to Firebird? Is there a valid tool? Has anyone succeeded in converting non-trivial databases?
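
    If the long constraint names turn out to be the blocker (older Firebird versions limit identifier names to 31 characters), they can be shortened on the SQL Server side before conversion; a minimal sketch with a made-up constraint name:

        EXEC sp_rename 'FK_OrderDetails_ReferencesProducts', 'FK_OrdDet_Prod', 'OBJECT';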

    Read the article

  • How to maintain a unique login in a Windows Forms application?

    - by Vivek
    Hello all, I am developing a WinForms application in which the user's login is validated against an MS SQL Server 2000 database. When a user enters a username and password, the application checks whether it exists in the user table. My requirement is that if a user has already logged in from one system, they should not be able to log in from another system. A solution like keeping a status flag in the database - mark the user as logged in on a successful login and clear the flag when the application closes - breaks down in the case of a network or hardware failure or an unhandled exception, because the flag is never cleared. Please suggest an optimal solution.
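
    One common pattern that survives crashes is a session table with a heartbeat timestamp that the client refreshes periodically; a login is treated as stale once the heartbeat is old. A minimal sketch (the table name, column sizes and the 5-minute window are assumptions):

        CREATE TABLE user_session (
            username       varchar(50) NOT NULL PRIMARY KEY,
            last_heartbeat datetime    NOT NULL
        )

        -- On login: block only if a recently refreshed session already exists.
        SELECT username
        FROM user_session
        WHERE username = 'some_user'
          AND last_heartbeat > DATEADD(minute, -5, GETDATE())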

    Read the article

  • Authenticating mssql users through different logins

    - by Sebastian Ferreyra
    I'm developing a new node.js-based frontend for an old intranet site that uses SQL Server and Classic ASP. Both the new and the old site will coexist during the transition phase, and both must access the same SQL Server using the same database principals. The old ASP/IIS site uses integrated Windows login. The new node-based frontend has to use SQL Server-based logins/principals (edit: while the IIS/ASP site must keep using Windows server principals), and all existing database users/principals must keep working wherever they log in from. From reading around, it appears that a database principal cannot be assigned to multiple server principals, so it seems unlikely there's a simple solution to this. I'd like to know if anybody has had to deal with a similar situation before and how they've gone about it.
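
    It is true that a database user maps to a single login, but a second database user (mapped to a SQL Server login) can be granted the same role memberships as the existing Windows-authenticated users. A minimal sketch, with made-up names and assuming SQL Server 2005-or-later syntax:

        CREATE LOGIN node_frontend WITH PASSWORD = 'use-a-strong-password-here';
        GO
        USE IntranetDb;
        CREATE USER node_frontend FOR LOGIN node_frontend;
        EXEC sp_addrolemember 'db_datareader', 'node_frontend';
        EXEC sp_addrolemember 'db_datawriter', 'node_frontend';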

    Read the article

  • Compare rows between 2 tables

    - by arthur
    I am new to SQL and I need to build a database for a grocery store (not real, just a course assignment). I have two fields from two different tables: the supplied price - the price that the store pays the supplier - and the price that is given to the customers. How can I make a constraint that ensures that the supplied price is lower than the price that is given to the customers? The relevant tables that I have are:

        CREATE TABLE Supplied_Products(
            [Supplier ID] Int NOT NULL Foreign Key References Suppliers,
            [Product ID] Int NOT NULL Foreign Key References Products,
            Price Float NOT NULL,
            CHECK (Price > 0),
            Constraint PK_Supplied_Products PRIMARY KEY([Supplier ID], [Product ID])
        )

        CREATE TABLE Products(
            [Product-ID] Int NOT NULL PRIMARY KEY,
            [Product Name] Varchar(20) NOT NULL,
            Price Float NOT NULL,
            [Category-Name] Varchar(20) NOT NULL Foreign Key References Categories,
            [Weight] Float NOT NULL,
            [Is Refrigirated] Varchar(1) DEFAULT 'N' CHECK ([Is Refrigirated] in ('Y','N')), /* Is Refrigirated can be only Y-yes or N-no */
            CHECK (Price > 0)
        )
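
    A plain CHECK constraint can only look at its own row, so a cross-table rule like this is usually enforced either in the insert/update procedures or with a scalar function wrapped in a CHECK. A minimal sketch of the function approach (SQL Server syntax, using the tables above; the function and constraint names are made up):

        CREATE FUNCTION dbo.fn_RetailPrice (@ProductID int)
        RETURNS float
        AS
        BEGIN
            RETURN (SELECT Price FROM Products WHERE [Product-ID] = @ProductID)
        END
        GO
        ALTER TABLE Supplied_Products
        ADD CONSTRAINT CK_SuppliedBelowRetail
            CHECK (Price < dbo.fn_RetailPrice([Product ID]))
        GO

    Note that this check only runs when rows in Supplied_Products are inserted or updated; a later change to Products.Price alone is not re-validated.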

    Read the article

  • Is it OK to pass SQLCommand as a parameter?

    - by TooFat
    I have a business layer that passes a connection string and a SqlCommand to a data layer, like so:

        public void PopulateLocalData()
        {
            System.Data.SqlClient.SqlCommand cmd = new System.Data.SqlClient.SqlCommand();
            cmd.CommandType = System.Data.CommandType.StoredProcedure;
            cmd.CommandText = "usp_PopulateServiceSurveyLocal";
            DataLayer.DataProvider.ExecSQL(ConnString, cmd);
        }

    The data layer then just executes the SQL, like so:

        public static int ExecSQL(string sqlConnString, System.Data.SqlClient.SqlCommand cmd)
        {
            int rowsAffected;
            using (SqlConnection conn = new SqlConnection(sqlConnString))
            {
                conn.Open();
                cmd.Connection = conn;
                rowsAffected = cmd.ExecuteNonQuery();
                cmd.Dispose();
            }
            return rowsAffected;
        }

    Is it OK for me to pass the SqlCommand as a parameter like this, or is there a better, more accepted way of doing it? One of my concerns is that if an error occurs when executing the query, the cmd.Dispose() line will never execute. Does that mean it will continue to use up memory that will never be released?

    Read the article

  • SSIS - Can I get the column schema for a flat file source from a database?

    - by Steve Clement
    We receive a nightly data export from a vendor in the form of about 10 tab-delimited flat files without column headers. In addition, the vendor provides us with the SQL scripts for the database tables so that we can import the files into our system. Unfortunately, the vendor recently changed the schema for the flat files. Each file has upwards of 150 columns, and having to go through the DB schema and adjust column types on a Flat File Data Source in SSIS is extremely time consuming, not to mention a royal pain. Since I know the file data layout in the database schema, is there any way I can dynamically pull that into a Flat File source to set the columns correctly? Or am I just stuck with manually setting everything?
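
    The column layout is already described in the vendor's table definitions, so one starting point is to query the database metadata and use the result to drive whatever rebuilds the flat-file columns (for example a script task or a package-generation step). A minimal sketch, with a made-up table name:

        SELECT COLUMN_NAME, ORDINAL_POSITION, DATA_TYPE, CHARACTER_MAXIMUM_LENGTH
        FROM INFORMATION_SCHEMA.COLUMNS
        WHERE TABLE_NAME = 'VendorExportTable'
        ORDER BY ORDINAL_POSITION;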

    Read the article

  • Trigger to update data in another DB

    - by Permana
    I have the following schema:

        Database: test.    Table: per_login_user, Fields: username (PK), password
        Database: wavinet. Table: login_user,     Fields: username (PK), password

    What I want to do is create a trigger: whenever the password field in table per_login_user in database test gets updated, the same value should be copied to the password field in table login_wavinet in database wavinet. I have searched through Google and found this solution: http://forums.devshed.com/ms-sql-development-95/use-trigger-to-update-data-in-another-db-149985.html But when I run this query:

        CREATE TRIGGER trgPasswordUpdater
        ON dbo.per_login_user
        FOR UPDATE
        AS
        UPDATE wavinet.dbo.login_user
        SET password = I.password
        FROM inserted I
        INNER JOIN deleted D ON I.username = D.username
        WHERE wavinet.dbo.login_wavinet.password = D.password

    the query returns this error message:

        Msg 107, Level 16, State 3, Procedure trgPasswordUpdater, Line 4
        The column prefix 'wavinet.dbo.login_wavinet' does not match with a table name or alias name used in the query.
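
    The error is about the WHERE clause referring to wavinet.dbo.login_wavinet, which is neither a table in the FROM list nor an alias used in the query. A minimal sketch of one way to write the trigger so every table in the update is aliased and joined on username (assuming username is what links the two tables):

        CREATE TRIGGER trgPasswordUpdater
        ON dbo.per_login_user
        FOR UPDATE
        AS
        UPDATE W
        SET W.password = I.password
        FROM wavinet.dbo.login_user AS W
        INNER JOIN inserted AS I ON W.username = I.username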

    Read the article

  • C# Finisar SQLite Date Format Problem

    - by Emanuel
    My "task" database table looks like this:

        [title]  [content]  [start_date]            [end_date]
        [...]    [...]      [01.06.2010 20:10:36]   [06.06.2010 20:10:36]
        [...]    [...]      [05.06.2010 20:10:36]   [06.06.2010 20:10:36]

    I want to find only those records where a given day falls between start_date and end_date. I've tried the following SQL expression:

        SELECT * FROM task
        WHERE strftime('%d', 'start_date') <= @day AND @day <= strftime('%d', 'end_date')

    where @day is an SQLiteParameter (equal to 5). But no result is returned. How can I solve this problem? Thanks.
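
    One thing to note is that 'start_date' in single quotes is a string literal rather than a column reference, and SQLite's date functions only understand ISO-style dates, not the dd.MM.yyyy format shown above. A minimal sketch of the comparison, assuming the dates are stored as yyyy-MM-dd HH:mm:ss and @day is a full yyyy-MM-dd date:

        SELECT *
        FROM task
        WHERE date(start_date) <= @day
          AND @day <= date(end_date);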

    Read the article

  • How Do I Search Between a Date Range Using the ActiveRecord Model?

    - by Russ Bradberry
    I am new to both Ruby and ActiveRecord. I currently need to modify an existing piece of code to add a date range to the select. The current piece goes like this:

        ReportsThirdparty.find(:all, :conditions => {:site_id=>site_id, :campaign_id=>campaign_id, :size_id=>size_id})

    Now I need to add a range, but I am not sure how to do the BETWEEN or >= or <= operators. I guess what I need is something similar to:

        ReportsThirdparty.find(:all, :conditions => {:site_id=>site_id, :campaign_id=>campaign_id, :size_id=>size_id, :row_date=>"BETWEEN #{start_date} AND #{end_date}"})

    Even if this did work, I know that using interpolation here would leave me open to SQL injection attacks.

    Read the article

  • How do I select differing rows in two MySQL tables with the same structure?

    - by chiborg
    I have two tables, A and B, that have the same structure (about 30+ fields). Is there a short, elegant way to join these tables and only select rows where one or more columns differ? I could certainly write a script that creates the query with all the column names, but maybe there is an SQL-only solution. To put it another way: is there a short substitute for this?

        SELECT * FROM table_a a
        JOIN table_b b ON a.pkey = b.pkey
        WHERE a.col1 != b.col1
           OR a.col2 != b.col2
           OR a.col3 != b.col3
        -- .. repeat for 30 columns
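
    MySQL's row constructors allow comparing several columns in one expression, which at least removes the repeated OR terms; a minimal sketch (with the same caveat as the OR version: columns that are NULL on both sides do not register as different):

        SELECT *
        FROM table_a a
        JOIN table_b b ON a.pkey = b.pkey
        WHERE (a.col1, a.col2, a.col3 /* ... */) <> (b.col1, b.col2, b.col3 /* ... */);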

    Read the article

  • Check if checkbox is checked or not (ASPX)

    - by cthulhu
    I have the following code in some.aspx.cs:

        if (Page.IsPostBack)
        {
            bool apple2 = false;
            bool pizza2 = false;
            bool orange2 = false;

            if (apple.Checked)
                apple2 = true;
            if (pizza.Checked)
                pizza2 = true;
            if (orange.Checked)
                orange2 = true;
        }

    And in some.aspx:

        <tr>
            <td>Food:</td>
            <td>Apple <input type="checkbox" name="food" id="apple" value="apple" runat="server" />
                Pizza <input type="checkbox" name="food" id="pizza" value="pizza" runat="server" />
                Orange <input type="checkbox" name="food" id="orange" value="orange" runat="server" /></td>

    I send the Boolean variables to a SQL database. The problem is only with unchecked boxes: when I check some checkboxes they are sent as true (and that's right), but when I uncheck them the stored value remains true.

    Read the article

  • Determining Connections between data in a single table

    - by user1689749
    Hi, I'm a BA/programmer type doing data analysis on a legacy system. I've been teaching myself SQL to help, but I appear to have hit a problem bigger than my abilities. I have two tables (generalized for simplicity):

        Table Objects:    Object_PK
        Table Components: Component_PK, Object_FK, Component_Type

    There are 100+ distinct values in Component_Type_Code. Given that any object can have N Components, how can I see which Component_Type(s) appear together with other Component_Type(s)? For example, the following query tells me which component_types appear with the component_type 'component_type_1':

        SELECT component_type_code, COUNT(*)
        FROM components
        WHERE object_fk IN (
            SELECT object_fk
            FROM components
            WHERE component_type_code = 'component_type_1'
        )
        GROUP BY component_type_code

    I'd like a query that shows me all the connections. Any help is appreciated. I've looked at CUBE and ROLLUP, but didn't know how to apply them to this situation.
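
    A self-join on Object_FK is one way to get every co-occurring pair in a single result set; a minimal sketch (assuming the Components table above, with the type column called component_type_code as in the example query):

        SELECT c1.component_type_code AS type_a,
               c2.component_type_code AS type_b,
               COUNT(DISTINCT c1.object_fk) AS objects_in_common
        FROM components c1
        JOIN components c2
          ON c1.object_fk = c2.object_fk
         AND c1.component_type_code < c2.component_type_code
        GROUP BY c1.component_type_code, c2.component_type_code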

    Read the article
