Search Results

Search found 62705 results on 2509 pages for 'sql calc found rows'.

  • Best way to stop SQL Injection in PHP

    - by Andrew G. Johnson
    So specifically in a mysql database. Take the following code and tell me what to do.

        // connect to the mysql database
        $unsafe_variable = $_POST["user-input"];
        mysql_query("INSERT INTO table (column) VALUES ('" . $unsafe_variable . "')");
        // disconnect from the mysql database

    Read the article

  • SSIS - Skip Missing Files

    - by Greg
    I have an SSIS 2008 package that calls about 10 other SSIS packages (legacy issues, don't ask). Each of those child packages loads a specific file into a table, but sometimes one or more of these input files will be missing. How can I let a child package fail (because a file is missing) but let the rest of the parent package keep on running? I've tried increasing the maximum error count on the parent package, on the tasks in the parent package that call each child, and in the child package itself. None of that seemed to make any difference. I still get this error when I run it with a file missing:

        SSIS Warning Code DTS_W_MAXIMUMERRORCOUNTREACHED. The Execution method succeeded, but the
        number of errors raised (2) reached the maximum allowed (1); resulting in failure. This
        occurs when the number of errors reaches the number specified in MaximumErrorCount.
        Change the MaximumErrorCount or fix the errors.

    Edit: FailPackageOnFailure and FailParentOnFailure are already set to false everywhere.

    Read the article

  • SQL Server - In clause with a declared variable

    - by Melursus
    Let's say I have the following:

        DECLARE @ExcludedList VARCHAR(MAX)
        SET @ExcludedList = 3 + ', ' + 4 + ' ,' + '22'

        SELECT * FROM A WHERE Id NOT IN (@ExcludedList)

    Error:

        Conversion failed when converting the varchar value ', ' to data type int.

    I understand why the error is there, but I don't know how to solve it...
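
    A minimal sketch of one common fix, assuming SQL Server 2016 or later where STRING_SPLIT is available (on older versions a tally-table splitter or dynamic SQL via sp_executesql plays the same role):

        -- IN (@ExcludedList) compares Id against one single string, not against a list of
        -- values; splitting the string back into rows restores the list.
        DECLARE @ExcludedList varchar(max) = '3,4,22';

        SELECT *
        FROM A
        WHERE Id NOT IN (SELECT CAST(value AS int)
                         FROM STRING_SPLIT(@ExcludedList, ','));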

    Read the article

  • how to use a parameterized function for the Default Binding of a Sql Server column

    - by Walt Gaber
    I have a table that catalogs selected files from multiple sources. I want to record whether a file is a duplicate of a previously cataloged file at the time the new file is cataloged. I have a column in my table (“primary_duplicate”) to record each entry as ‘P’ (primary) or ‘D’ (duplicate). I would like to provide a Default Binding for this column that would check for other occurrences of this file (i.e. name, length, timestamp) at the time the new file is being recorded. I have created a function that performs this check (see “GetPrimaryDuplicate” below). But I don’t know how to bind this function, which requires three parameters, to the table’s “primary_duplicate” column as its Default Binding. I would like to avoid using a trigger. I currently have a stored procedure used to insert new records that performs this check. But I would like to ensure that the flag is set correctly if an insert is performed outside of this stored procedure. How can I call this function with values from the row that is being inserted?

        USE [MyDatabase]
        GO
        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        CREATE TABLE [dbo].[FileCatalog](
            [id] [uniqueidentifier] NOT NULL,
            [catalog_timestamp] [datetime] NOT NULL,
            [primary_duplicate] nchar NOT NULL,
            [name] nvarchar NULL,
            [length] [bigint] NULL,
            [timestamp] [datetime] NULL
        ) ON [PRIMARY]
        GO
        ALTER TABLE [dbo].[FileCatalog] ADD CONSTRAINT [DF_FileCatalog_id]
            DEFAULT (newid()) FOR [id]
        GO
        ALTER TABLE [dbo].[FileCatalog] ADD CONSTRAINT [DF_FileCatalog_catalog_timestamp]
            DEFAULT (getdate()) FOR [catalog_timestamp]
        GO
        ALTER TABLE [dbo].[FileCatalog] ADD CONSTRAINT [DF_FileCatalog_primary_duplicate]
            DEFAULT (N'GetPrimaryDuplicate(name, length, timestamp)') FOR [primary_duplicate]
        GO

        USE [MyDatabase]
        GO
        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        CREATE FUNCTION [dbo].[GetPrimaryDuplicate]
        (
            @name nvarchar(255),
            @length bigint,
            @timestamp datetime
        )
        RETURNS nchar(1)
        AS
        BEGIN
            DECLARE @c int
            SELECT @c = COUNT(*)
            FROM FileCatalog
            WHERE name = @name AND length = @length AND timestamp = @timestamp
              AND primary_duplicate = 'P'
            IF @c > 0
                RETURN 'D' -- Duplicate
            RETURN 'P' -- Primary
        END
        GO

    Read the article

  • DB2: Won't allow parameterized 'fetch first X rows only'

    - by Guy Roth
    Although Oracle allows you to parameterize the number of rows a query can fetch by adding to the query:

        select ... from ... where ... and rownum <= @MaximumRecords

    I can't add a similar condition to the equivalent query running in DB2. It is allowed to add:

        select ... from ... where ... fetch first 500 rows only

    (where there is a fixed number of rows) but not:

        select ... from ... where ... fetch first :1 rows only    (:1 == @MaximumRecords)

    Is anyone aware of a solution or workaround to this problem?
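
    A common workaround, sketched below with assumed table and column names: number the rows with ROW_NUMBER() and compare that to a parameter marker, which DB2 accepts where a parameterized FETCH FIRST is rejected.

        SELECT *
        FROM (
            SELECT t.*,
                   ROW_NUMBER() OVER (ORDER BY t.id) AS rn   -- ordering column assumed
            FROM my_table t                                  -- table name assumed
        ) AS numbered
        WHERE rn <= ?                                        -- bound to @MaximumRecords at run time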

    Read the article

  • How to create relationship between two tables with revisions using Entity Framework

    - by Chris Ridenour
    So I am in the process of redesigning a small database (and potentially a much larger one) but want to show the value of using revisions / history for the business objects. I am switching the data from Access to MSSQL 2008. I have had a lot of internal debate about which flavor of "revision history" to use in the design itself, and thought I had decided to add a "RevisionId" to all tables. With this design (adding a RevisionId to all tables we would like tracked), what would be the best way to create navigation properties and relationships between two tables such as | Vendor | VendorContact |, where a Vendor can have multiple contacts? The Contacts themselves will be under revision. Will it require custom extensions, or am I overthinking this? Thanks in advance.

    Read the article

  • Data Access from single table in sql server 2005 is too slow

    - by Muhammad Kashif Nadeem
    Following is the script of the table. Accessing data from this table is too slow.

        SET ANSI_NULLS ON
        GO
        SET QUOTED_IDENTIFIER ON
        GO
        CREATE TABLE [dbo].[Emails](
            [id] [int] IDENTITY(1,1) NOT NULL,
            [datecreated] [datetime] NULL CONSTRAINT [DF_Emails_datecreated] DEFAULT (getdate()),
            [UID] [nvarchar](250) COLLATE Latin1_General_CI_AS NULL,
            [From] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL,
            [To] [nvarchar](100) COLLATE Latin1_General_CI_AS NULL,
            [Subject] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [Body] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [HTML] [nvarchar](max) COLLATE Latin1_General_CI_AS NULL,
            [AttachmentCount] [int] NULL,
            [Dated] [datetime] NULL
        ) ON [PRIMARY]

    The following query takes 50 seconds to fetch data:

        select id, datecreated, UID, [From], [To], Subject, AttachmentCount, Dated
        from emails

    If I include Body and HTML in the select, the time is even worse. The indexes are:

        id    unique, clustered
        From  non-unique, non-clustered
        To    non-unique, non-clustered

    The table currently has 180,000+ records, and there might be 100,000 new records each month, so this will only get slower as time passes. Will splitting the data into two tables solve the problem? What other indexes should there be?
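
    One option worth trying, sketched as an assumption rather than a guaranteed fix: a covering nonclustered index over the narrow columns, so the report query never has to read the pages holding Body and HTML. The clustered key (id) is carried in the index automatically.

        -- Subject is nvarchar(max); it is allowed as an INCLUDE column, but if it is
        -- usually large, consider dropping it from the hot query instead.
        CREATE NONCLUSTERED INDEX IX_Emails_Covering
        ON dbo.Emails (Dated)
        INCLUDE (datecreated, [UID], [From], [To], Subject, AttachmentCount);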

    Read the article

  • SQL column length query

    - by Mike
    I've been using the following query:

        select LEN(columnname) as columnname from dbo.amu_datastaging

    This works, but is there a way to return only the greatest value instead of all the values? So if I return 1 million records and the longest length is 400, the query would just return the value 400.
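
    A minimal sketch of the aggregate form:

        -- Returns one row: the greatest length found in the column.
        SELECT MAX(LEN(columnname)) AS longest_length
        FROM dbo.amu_datastaging;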

    Read the article

  • SQL Server Concatenate string column value to 5 char long

    - by mrp
    Scenario: I have a table1 (col1 char(5)). A value in table1 may be '001', '01', or '1'. Requirement: whatever the value in col1, I need to retrieve it as a 5-character value, padded with leading '0's to make it 5 characters long. Technique I applied:

        select right(('00000' + col1), 5) from table1;

    I don't see any reason why it shouldn't work, but it didn't. Can anyone help me achieve the desired result?
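
    For what it's worth, the usual explanation is the char(5) padding: '1' is stored as '1    ', so RIGHT('00000' + '1    ', 5) returns '1    ' again. A sketch of the common fix:

        -- Trim the trailing blanks before padding on the left.
        SELECT RIGHT('00000' + RTRIM(col1), 5)
        FROM table1;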

    Read the article

  • Most optimal way to convert to date

    - by IMHO
    I have a legacy system where all date fields are maintained in YMD format. Example: 20101123 is the date 11/23/2010. I'm looking for the most optimal way to convert from the number to a date field. Here is what I came up with:

        declare @ymd int
        set @ymd = 20101122
        select @ymd, convert(datetime, cast(@ymd as varchar(100)), 112)

    This is a pretty good solution, but I'm wondering if someone has a better way of doing it.

    Read the article

  • Picking Random Names

    - by Jasl
    I saw an interesting post some time back, but with no solution. Trying my luck here: there is a table which contains 10 names (U1, U2, U3... and so on). I have to choose 5 names every day, and display one as the Editor and 4 as Contributors. While selecting the random names, I also have to ensure that once a user has been selected as Editor, he cannot become Editor again until everyone has had their chance. The output should look similar to the following:

                Editor  Cont1  Cont2  Cont3  Cont4
        20-Jun  U1      U8     U9     U3     U4
        21-Jun  U7      U2     U5     U6     U10
        22-Jun  U3      U4     U9     U2     U8
        23-Jun  U4      U8     U3     U5     U2

    and so on.
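
    A rough starting point, with table and column names assumed; the "no repeat editor until everyone has had a turn" rule would still need a history table of past editors to exclude.

        -- Shuffle with NEWID(), keep five, and label one of them Editor.
        WITH picked AS (
            SELECT TOP (5) UserName
            FROM dbo.Users               -- assumed table holding the 10 names
            ORDER BY NEWID()             -- random order
        )
        SELECT UserName,
               CASE WHEN ROW_NUMBER() OVER (ORDER BY NEWID()) = 1
                    THEN 'Editor' ELSE 'Contributor' END AS MemberRole
        FROM picked;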

    Read the article

  • SQL Server unique constraint problem

    - by b0x0rz
    How do I create a unique constraint on a varchar(max) field in Visual Studio, visually? The problem is that when I try it (Manage Indexes and Keys > Add Columns), I can only choose the bigint columns, but not any of the varchar(max) ones. Do I maybe have to use check constraints? If yes, what do I put in the expression? Thanks for the info.
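
    SQL Server cannot index an (n)varchar(max) key directly, so a common pattern (a sketch only, with assumed table and column names) is to enforce uniqueness on a hash of the value:

        -- Note: before SQL Server 2016, HASHBYTES only hashes the first 8000 bytes of its input.
        ALTER TABLE dbo.MyTable
            ADD BigValueHash AS CAST(HASHBYTES('SHA1', BigValue) AS varbinary(20)) PERSISTED;

        CREATE UNIQUE NONCLUSTERED INDEX UX_MyTable_BigValueHash
            ON dbo.MyTable (BigValueHash);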

    Read the article

  • constructing dynamic In Statements with sql

    - by nitroxn
    Suppose we need to check three boolean conditions to perform a select query. Let the three flags be 'A', 'B' and 'C'. If all three flags are set to '1', then the query to be generated is:

        SELECT * FROM Food WHERE Name IN ('Apple', 'Biscuit', 'Chocolate');

    If only the flags 'A' and 'B' are set to '1', with C set to '0', then the following query is generated:

        SELECT * FROM Food WHERE Name IN ('Apple', 'Biscuit');

    What is the best way to do this?
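
    One sketch that avoids building SQL strings at all: let each flag switch its own value on or off inside a single WHERE clause.

        DECLARE @A bit = 1, @B bit = 1, @C bit = 0;   -- example flag values

        SELECT *
        FROM Food
        WHERE (@A = 1 AND Name = 'Apple')
           OR (@B = 1 AND Name = 'Biscuit')
           OR (@C = 1 AND Name = 'Chocolate');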

    Read the article

  • T-SQL query and group by reporting help

    - by Dayton Brown
    So I have some data that looks like this:

        USERID1  USERID2
        1        10
        2        20
        2        30
        3        40
        3        50
        1        10
        2        20
        2        30
        3        50

    I want a query that produces the following:

        USERID1  COUNT
        2        2
        3        2

    It's a GROUP BY query that shows the count of unique USERID2 values for each USERID1 that has more than one USERID2 associated with it. God, I hope you aren't as confused as I am by that last statement.
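
    A sketch of the query described, assuming the data lives in a table called dbo.UserPairs:

        SELECT USERID1, COUNT(DISTINCT USERID2) AS [COUNT]
        FROM dbo.UserPairs
        GROUP BY USERID1
        HAVING COUNT(DISTINCT USERID2) > 1;   -- only USERID1 values with more than one USERID2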

    Read the article

  • "lock request time out period exceeded" Error When Trying to See DB Hierarchies

    - by Lloyd Banks
    I have a DB that I can run basic queries against (albeit much slower than normal). When I try to expand the hierarchy trees for tables, views, or procedures in SSMS Object Explorer, I get the "lock request time out period exceeded" error. My Report Server reports that run off objects in this DB no longer complete. Jobs associated with procedures stored in this DB also do not run. I tried using sp_who2 to find and kill all connections on the DB, but this has not solved the problem. What is going on here? How can I resolve this?
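
    A quick way to see who is holding things up (requires VIEW SERVER STATE permission) is to look for blocked sessions; a minimal sketch:

        SELECT r.session_id,
               r.blocking_session_id,   -- the session holding the lock
               r.wait_type,
               r.wait_time
        FROM sys.dm_exec_requests AS r
        WHERE r.blocking_session_id <> 0;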

    Read the article

  • INSERT statement conflicted with the FOREIGN KEY constraint

    - by SmartestVEGA
    I am getting the following error. Could you please help me with it?

        Msg 547, Level 16, State 0, Line 1
        The INSERT statement conflicted with the FOREIGN KEY constraint "FK_Sup_Item_Sup_Item_Cat".
        The conflict occurred in database "dev_bo", table "dbo.Sup_Item_Cat".
        The statement has been terminated.

        insert into sup_item (supplier_id, sup_item_id, name, sup_item_cat_id, status_code,
                              last_modified_user_id, last_modified_timestamp, client_id)
        values (10162425, 10, 'jaiso', '123123', 'a', '12', '2010-12-12', '1062425')

    I am getting the conflict on the last column, "client_id". I tried putting a value that already exists in dbo.Sup_Item_Cat into the corresponding column of sup_item, but no joy :-(
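
    A diagnostic sketch: rather than guessing which column is at fault, ask the catalog views which columns FK_Sup_Item_Sup_Item_Cat actually covers, then check that the value being inserted exists in the parent table.

        SELECT fk.name AS constraint_name,
               pc.name AS column_in_sup_item,
               rc.name AS column_in_sup_item_cat
        FROM sys.foreign_keys fk
        JOIN sys.foreign_key_columns fkc
            ON fkc.constraint_object_id = fk.object_id
        JOIN sys.columns pc
            ON pc.object_id = fkc.parent_object_id AND pc.column_id = fkc.parent_column_id
        JOIN sys.columns rc
            ON rc.object_id = fkc.referenced_object_id AND rc.column_id = fkc.referenced_column_id
        WHERE fk.name = 'FK_Sup_Item_Sup_Item_Cat';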

    Read the article

  • regarding like query operator

    - by stackoverflowuser
    Hi. For the data below (well, there are many more nodes in the Team Foundation Server table that I need to refer to; below is just a sample):

        Nodes
        ------------------------
        \node1\node2\node3\
        \node1\node2\node5\
        \node1\node2\node3\node4

    I was wondering if I can apply something like (the below query does not give the required results):

        select * from table_a where nodes like '\node1\node2\%\'

    to get the below data:

        \node1\node2\node3\
        \node1\node2\node5\

    and something like (the below does not give the required results either):

        select * from table_a where nodes like '\node1\node2\%\%\'

    to get:

        \node1\node2\node3\
        \node1\node2\node5\
        \node1\node2\node3\node4

    Can the above be done with the LIKE operator? Please suggest. Thanks.
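
    One caveat with LIKE here is that % also matches backslashes, so patterns alone can't pin the depth. A sketch that counts the separators instead (table and column names from the question):

        -- Direct children only: paths of the form \node1\node2\<x>\ contain exactly 4 backslashes.
        SELECT *
        FROM table_a
        WHERE nodes LIKE '\node1\node2\%\'
          AND LEN(nodes) - LEN(REPLACE(nodes, '\', '')) = 4;

        -- Everything under \node1\node2\, at any depth.
        SELECT *
        FROM table_a
        WHERE nodes LIKE '\node1\node2\%';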

    Read the article

  • Is it OK to re-create many SQL connections (SQL 2008)

    - by Mr. Flibble
    When performing many inserts into a database I would usually have code like this:

        using (var connection = new SqlConnection(connStr))
        {
            connection.Open();
            foreach (var item in items)
            {
                var cmd = new SqlCommand("INSERT ...");
                cmd.ExecuteNonQuery();
            }
        }

    I now want to shard the database and therefore need to choose the connection string based on the item being inserted. That would make my code run more like this:

        foreach (var item in items)
        {
            connStr = GetConnectionString(item);
            using (var connection = new SqlConnection(connStr))
            {
                connection.Open();
                var cmd = new SqlCommand("INSERT ...");
                cmd.ExecuteNonQuery();
            }
        }

    which basically means it's creating a new connection to the database for each item. Will this work, or will recreating connections for each insert cause terrible overhead?

    Read the article

  • SQL Server: serializable level not working

    - by Zé Carlos
    I have the following SP:

        CREATE PROCEDURE [dbo].[sp_LockReader]
        AS
        BEGIN
            SET NOCOUNT ON;
            BEGIN TRY
                SET TRANSACTION ISOLATION LEVEL SERIALIZABLE
                BEGIN TRAN
                    SELECT * FROM teste
                COMMIT TRAN
            END TRY
            BEGIN CATCH
                ROLLBACK TRAN
                SET TRANSACTION ISOLATION LEVEL READ COMMITTED
            END CATCH
            SET TRANSACTION ISOLATION LEVEL READ COMMITTED
        END

    The table "teste" has many values, so "select * from teste" takes several seconds. I run sp_LockReader at the same time in two different query windows, and the second one starts showing the table's contents before the first one terminates. Shouldn't the serializable level force the second query to wait? How do I get the described behaviour? Thanks
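
    For what it's worth, shared locks taken under SERIALIZABLE are compatible with other shared locks, so two concurrent readers never block each other. If the second reader really must wait, an exclusive hint is one (heavy-handed) option; a sketch:

        BEGIN TRAN
            SELECT * FROM teste WITH (TABLOCKX, HOLDLOCK)   -- exclusive table lock until commit
            -- other sessions now wait here
        COMMIT TRAN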

    Read the article

  • Using "CASE" in Where clause to choose various column harm the performance

    - by zivgabo
    I have a query that needs to be dynamic in some of its columns, meaning I get a parameter and, according to its value, decide which column to filter on in my WHERE clause. I've implemented this using a CASE expression:

        (CASE @isArrivalTime WHEN 1 THEN ArrivalTime ELSE PickedupTime END) >= DATEADD(mi, -@TZOffsetInMins, @sTime)
        AND (CASE @isArrivalTime WHEN 1 THEN ArrivalTime ELSE PickedupTime END) < DATEADD(mi, -@TZOffsetInMins, @fTime)

    If @isArrivalTime = 1 then choose the ArrivalTime column, else choose the PickedupTime column. I have a clustered index on ArrivalTime and a nonclustered index on PickedupTime. I've noticed that when I use this query (with @isArrivalTime = 1), performance is a lot worse compared to using only ArrivalTime. Maybe the query optimizer can't use or choose the index properly this way? I compared the execution plans and noticed that with the CASE, 32% of the time is spent on the index scan, but when I didn't use the CASE (just used ArrivalTime), only 3% was spent on that index scan. Does anyone know the reason for this?
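
    A sketch of one common rewrite (table name assumed): branch on the flag once so each branch has a plain, sargable predicate and can seek on its own index.

        IF @isArrivalTime = 1
            SELECT *
            FROM dbo.Trips
            WHERE ArrivalTime >= DATEADD(mi, -@TZOffsetInMins, @sTime)
              AND ArrivalTime <  DATEADD(mi, -@TZOffsetInMins, @fTime);
        ELSE
            SELECT *
            FROM dbo.Trips
            WHERE PickedupTime >= DATEADD(mi, -@TZOffsetInMins, @sTime)
              AND PickedupTime <  DATEADD(mi, -@TZOffsetInMins, @fTime);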

    Read the article

  • Insert a Join Statement - (Insert Data to Multiple Tables) - C#/SQL/T-SQL/.NET

    - by peace
    I have a WinForm with fields that need to be filled in by a user. The fields don't all belong to one table; the data will go to a Customer table and a CustomerPhone table, so I decided to do multiple inserts: I will insert the appropriate data into CustomerPhone first, then insert the rest of the data into Customer. Is it possible to join an INSERT, or insert a join? If you can show me a rough sample, I will be grateful. Many thanks
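
    There is no single INSERT that writes to two tables at once; the usual pattern is one transaction with two INSERTs, carrying the generated key across. A sketch with assumed column names, assuming Customer is the parent of CustomerPhone:

        BEGIN TRAN;

        INSERT INTO dbo.Customer (FirstName, LastName)
        VALUES (@FirstName, @LastName);

        DECLARE @CustomerId int = SCOPE_IDENTITY();   -- identity generated by the insert above

        INSERT INTO dbo.CustomerPhone (CustomerId, PhoneNumber)
        VALUES (@CustomerId, @PhoneNumber);

        COMMIT TRAN;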

    Read the article

  • Files and filegroups sql server 2005

    - by Dhivagar
    Can we move a default file to another filegroup? Sample code is given below:

        create database EMPLOYEE
        ON PRIMARY
        (   NAME = 'PRIMARY_01',
            FILENAME = 'C:\METADATA\PRIM01.MDF',
            SIZE = 5 MB, MAXSIZE = 50 MB, FILEGROWTH = 2 MB),
        (   NAME = 'SECONDARY_02',
            FILENAME = 'C:\METADATA\SEC02.NDF' ),
        FILEGROUP EMPLOYEE_dETAILS
        (   NAME = 'EMPDETILS_01',
            FILENAME = 'C:\METADATA\EMPDET01.NDF',
            SIZE = 5 MB, MAXSIZE = 50 MB, FILEGROWTH = 2 MB),
        (   NAME = 'EMPDETILS_02',
            FILENAME = 'C:\METADATA\EMPDET02.NDF',
            SIZE = 5 MB, MAXSIZE = 50 MB, FILEGROWTH = 2 MB)
        LOG ON
        (   NAME = 'TRANSACLOG',
            FILENAME = 'c:\METADATA\TRAS01.LDF',
            SIZE = 5 MB, MAXSIZE = 50 MB, FILEGROWTH = 2 MB )

    Now I want to move FILENAME = 'C:\METADATA\SEC02.NDF' from the default primary filegroup to the filegroup EMPLOYEE_dETAILS. Can anyone assist?
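
    An existing file can't simply be reassigned to a different filegroup. One route, sketched under the assumption that replacing the physical file (not the data) is acceptable: add a new file to the target filegroup, empty the old file into the rest of its own filegroup, then drop it.

        ALTER DATABASE EMPLOYEE
            ADD FILE (NAME = 'SECONDARY_02_NEW',
                      FILENAME = 'C:\METADATA\SEC02_NEW.NDF')
            TO FILEGROUP EMPLOYEE_dETAILS;

        DBCC SHRINKFILE (N'SECONDARY_02', EMPTYFILE);   -- pushes its pages to the other PRIMARY file

        ALTER DATABASE EMPLOYEE REMOVE FILE SECONDARY_02;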

    Read the article
