Search Results

Search found 37647 results on 1506 pages for 'sql performance'.

  • How to load SQL data into several ComboBoxes easily - am I doing it correctly, or is there another way?

    - by Dominic Deepan.d
    I have ComboBoxes to fill with data for City, State and PinCode. These ComboBoxes are drop-down lists the user picks from, and they load once the form opens. Here is the code:

        // Code to bring data from SQL into the form drop-down lists
        // To fill the states from the States table
        cn = new SqlConnection(@"Data Source=Nick-PC\SQLEXPRESS;Initial Catalog=AutoDB;Integrated Security=True");
        cmd = new SqlCommand("select * from TblState", cn);
        cn.Open();
        SqlDataReader dr;
        try
        {
            dr = cmd.ExecuteReader();
            while (dr.Read())
            {
                SelectState.Items.Add(dr["State"].ToString());
            }
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
        finally
        {
            cn.Close();
        }

        // To fill the cities from the City table
        cn1 = new SqlConnection(@"Data Source=Nick-PC\SQLEXPRESS;Initial Catalog=AutoDB;Integrated Security=True");
        cmd1 = new SqlCommand("SELECT * FROM TblCity", cn);
        cn.Open();
        SqlDataReader ds;
        try
        {
            ds = cmd1.ExecuteReader();
            while (ds.Read())
            {
                SelectCity.Items.Add(ds["City"].ToString());
            }
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
        finally
        {
            cn1.Close();
        }

        // To fill the pincode data from the City table
        cn2 = new SqlConnection(@"Data Source=Nick-PC\SQLEXPRESS;Initial Catalog=AutoDB;Integrated Security=True");
        cmd2 = new SqlCommand("SELECT (Pincode) FROM TblCity ", cn2);
        cn2.Open();
        SqlDataReader dm;
        try
        {
            dm = cmd2.ExecuteReader();
            while (dm.Read())
            {
                SelectPinCode.Items.Add(dm["Pincode"].ToString());
            }
        }
        catch (Exception ex)
        {
            MessageBox.Show(ex.Message);
        }
        finally
        {
            cn2.Close();
        }

    It's kind of big, and I am doing the same steps for every combo box. Is there a way I can merge it in a simple way?
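
    One common way to collapse the repetition is a single helper that takes the target ComboBox, the query and the column name. The sketch below is not the poster's code: the FillComboBox name is made up, the connection string is copied from the question, and it assumes the usual System.Data.SqlClient and System.Windows.Forms namespaces inside the form class.

        // A minimal reusable fill method - replaces the three near-identical blocks above.
        private void FillComboBox(ComboBox target, string query, string columnName)
        {
            using (var connection = new SqlConnection(@"Data Source=Nick-PC\SQLEXPRESS;Initial Catalog=AutoDB;Integrated Security=True"))
            using (var command = new SqlCommand(query, connection))
            {
                try
                {
                    connection.Open();
                    using (SqlDataReader reader = command.ExecuteReader())
                    {
                        while (reader.Read())
                        {
                            target.Items.Add(reader[columnName].ToString());
                        }
                    }
                }
                catch (Exception ex)
                {
                    MessageBox.Show(ex.Message);
                }
            }
        }

        // Called once when the form loads:
        // FillComboBox(SelectState,   "SELECT State FROM TblState",   "State");
        // FillComboBox(SelectCity,    "SELECT City FROM TblCity",     "City");
        // FillComboBox(SelectPinCode, "SELECT Pincode FROM TblCity",  "Pincode");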

  • Can I split a single SQL 2008 DB table into multiple filegroups based on a discriminator column?

    - by Pure.Krome
    Hi folks, I've got a SQL Server 2008 R2 database which has a number of tables. Two of these tables contain a lot of large data, mainly because one of them is VARBINARY(MAX) and the sister table is GEOGRAPHY. (Why two tables? Read below if you're interested.***) The data in these tables is geospatial shapes, such as zipcode boundaries. Now, the first 70K-odd rows are for DataType = 1; the remaining 5 million rows are for DataType = 2. Is it possible to split the table data into two files, so that all rows where DataType != 2 go into File_A and the DataType = 2 rows go into File_B? This way, when I back up the DB, I can skip adding File_B, so my download is way smaller. Is this possible? You might be thinking: why not keep them as TWO extra tables? Mainly because in the code the data is conceptually the same; it just happens that I want to split the storage of this model data. It really messes up my model if I now have two aggregates instead of one. ***Entity Framework doesn't like tables with GEOGRAPHY, so I have to create a new table which transforms the GEOGRAPHY to VARBINARY, and then drop that into EF.
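
    Splitting one table across filegroups by a column value is normally done with table partitioning. The sketch below is not from the question: it assumes Enterprise/Developer edition, assumes filegroups named FG_Small and FG_Large already exist, and uses an invented connection string. The table (or its clustered index) would afterwards have to be rebuilt on the partition scheme, keyed on DataType.

        // Hypothetical sketch: partition function + scheme, one DDL batch each.
        string connectionString = @"Data Source=.;Initial Catalog=MyGeoDb;Integrated Security=True"; // assumed
        string[] ddlBatches =
        {
            // partition 1: DataType <= 1 -> FG_Small; partition 2: DataType > 1 -> FG_Large
            "CREATE PARTITION FUNCTION pf_DataType (int) AS RANGE LEFT FOR VALUES (1)",
            "CREATE PARTITION SCHEME ps_DataType AS PARTITION pf_DataType TO (FG_Small, FG_Large)"
        };

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            foreach (string ddl in ddlBatches)
            {
                using (var command = new SqlCommand(ddl, connection))
                {
                    command.ExecuteNonQuery();
                }
            }
        }

    A file/filegroup backup (BACKUP DATABASE ... FILEGROUP = 'FG_Small' TO DISK = ...) should then be able to leave the large filegroup out of the regular download, which is the stated goal.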

  • Adding a clustered index to a SQL table: what dangers exist for a live production system?

    - by MoSlo
    Right, keep in mind I need to describe this while abstracting away all possibly confidential info. I've been put in charge of a 10-year-old transactional system, the majority of whose business logic is implemented at database level (triggers, stored procedures etc). Windows 2000 server, MSSQL 2000 Enterprise. No immediate plans for replacing or updating the system are being considered. The core process is a program that executes transactions - specifically, it executes a stored procedure with various parameters; let's call it sp_ProcessTrans. The program executes the stored procedure at asynchronous intervals. By itself, things work fine. But there are 30 instances of this program on remotely located workstations, all of them asynchronously executing sp_ProcessTrans and then retrieving data from the SQL Server (execution is pretty regular, ranging from 0 to 60 times a minute, depending on what items the program instance is responsible for). Performance of the system has dropped considerably with 10 years of data growth: the reason is deadlocks, and specifically deadlock wait times. The deadlock is on the Employee table. I have discovered that sp_ProcessTrans selects from the Employee table 7 times during its execution (don't ask), that the select is done on a field that is NOT the primary key, and that no index exists on this field - so a table scan is performed, 7 times, per transaction. So the reason for the deadlocks is clear. I created a non-unique ordered clustered index on the field (the field looks good: almost unique, NUM(7), very rarely changes) and saw an immediate improvement in the test environment. The problem is that I cannot simulate the deadlocks in a test environment (I'd need 30 workstations, and I'd need to simulate "realistic" activity on those stations, so virtualization is out). I need to know if I must schedule downtime. Creating an index shouldn't be a risky operation for MSSQL, but is there any danger (data corruption in transactions/select statements, extra wait time, etc.) in creating this index on the production database while the transactions are still taking place? (I can pick a time when transactions are fairly quiet across the 30 stations.) Are there any hidden dangers I'm not seeing? I'm not looking forward to needing to restore the DB if something goes wrong; restoring would take a lot of time with 10 years of data.
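
    For downtime planning, the key constraint (not stated in the question) is that SQL Server 2000 has no ONLINE index builds; those arrived in SQL Server 2005. Creating the clustered index offline locks the table for the duration, physically reorders the rows and rebuilds every nonclustered index on the table. A hypothetical illustration follows; the index and column names are invented, and the connection string is assumed.

        // Sketch only - run during the quiet window, since the build blocks access to dbo.Employee.
        string connectionString = @"Data Source=ProdServer;Initial Catalog=TransDb;Integrated Security=True"; // assumed
        string ddl = @"CREATE CLUSTERED INDEX IX_Employee_EmpNum
                       ON dbo.Employee (EmpNum)
                       WITH SORT_IN_TEMPDB";   // shifts the sort work into tempdb (available on SQL 2000)

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(ddl, connection) { CommandTimeout = 0 })
        {
            connection.Open();
            command.ExecuteNonQuery();   // blocks readers and writers of the table until it completes
        }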

  • How do I add a where filter using the original Linq-to-SQL object in the following scenario?

    - by GenericTypeTea
    I am performing a select query using the following Linq expression:

        Table<Tbl_Movement> movements = context.Tbl_Movement;
        var query = from m in movements
                    select new MovementSummary
                    {
                        Id = m.DocketId,
                        Created = m.DateTimeStamp,
                        CreatedBy = m.Tbl_User.FullName,
                        DocketNumber = m.DocketNumber,
                        DocketTypeDescription = m.Ref_DocketType.DocketType,
                        DocketTypeId = m.DocketTypeId,
                        Site = new Site()
                        {
                            Id = m.Tbl_Site.SiteId,
                            FirstLine = m.Tbl_Site.FirstLine,
                            Postcode = m.Tbl_Site.Postcode,
                            SiteName = m.Tbl_Site.SiteName,
                            TownCity = m.Tbl_Site.TownCity,
                            Brewery = new Brewery()
                            {
                                Id = m.Tbl_Site.Ref_Brewery.BreweryId,
                                BreweryName = m.Tbl_Site.Ref_Brewery.BreweryName
                            },
                            Region = new Region()
                            {
                                Description = m.Tbl_Site.Ref_Region.Description,
                                Id = m.Tbl_Site.Ref_Region.RegionId
                            }
                        }
                    };

    I am also passing an IFilter class into the method where this select is performed:

        public interface IJobFilter
        {
            int? PersonId { get; set; }
            int? RegionId { get; set; }
            int? SiteId { get; set; }
            int? AssetId { get; set; }
        }

    How do I add these where parameters to my SQL expression? Preferably I'd like this done in another method, as the filtering will be re-used across multiple repositories. Unfortunately, when I do query.Where it has become an IQueryable<MovementSummary>; I'm assuming that's because I'm returning an IEnumerable<MovementSummary>. I've only just started learning LINQ, so be gentle.
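
    One common approach (a sketch, not the poster's code) is to apply the optional filters to the IQueryable<Tbl_Movement> source before projecting to MovementSummary, so Linq to SQL can still translate everything into one SQL statement. The property paths below are guesses based on the query and interface above, and it assumes a using System.Linq directive.

        // Hypothetical reusable filter - null values are simply skipped.
        public static class MovementFilterExtensions
        {
            public static IQueryable<Tbl_Movement> ApplyFilter(this IQueryable<Tbl_Movement> movements, IJobFilter filter)
            {
                if (filter.RegionId.HasValue)
                    movements = movements.Where(m => m.Tbl_Site.Ref_Region.RegionId == filter.RegionId.Value);
                if (filter.SiteId.HasValue)
                    movements = movements.Where(m => m.Tbl_Site.SiteId == filter.SiteId.Value);
                // PersonId and AssetId would be handled the same way against their columns.
                return movements;
            }
        }

        // Usage in the repository - filter first, then project:
        // var query = context.Tbl_Movement.ApplyFilter(jobFilter)
        //                    .Select(m => new MovementSummary { Id = m.DocketId, /* ... as above ... */ });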

  • Large Product catalog with statistics - alternatives to Sql Server?

    - by Eric P
    I am building the UI for a large product catalog (millions of products). I am using SQL Server, FreeText search and ASP.NET MVC. Tables are normalized and indexed, and most queries take less than a second to return. The issue is this: say a user searches by keyword. On the search results page I need to display/query for:

    - the first 20 matching products (paged, sorted)
    - the total count of matching products, for paging
    - the list of stores for the matching products only
    - the list of brands for the matching products only
    - the list of colors for the matching products only

    Each query takes about 0.5 to 1 second, so altogether it is around 5 seconds, and I would like the whole page to load in under 1 second. There are several approaches:

    1. Optimize the queries even more. I have already spent a lot of time on this, so I'm not sure it can be pushed much further.
    2. Load the products first, then load the rest of the information using AJAX. More of a workaround; it would need a UI revision.
    3. Re-organize the data to be more report-friendly. I have already aggregated a lot of fields.

    I checked out several similar sites, for example zappos.com. Not only do they display the same information as I would like in under 1 second, they also include statistics (the number of results in each category). The following is the search for the keyword "white": http://www.zappos.com/white How do sites like Zappos and Amazon make their results, filters and stats appear almost instantly?
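
    One option that is not in the list above is to send the page, the count and the facet queries to the server as a single batch and read them from one SqlDataReader, so the page pays for one round trip instead of five. The sketch below is an assumption-laden illustration: table and column names, the connection string and the keyword variable are all invented.

        string keyword = "white";                                                                   // the user's search term
        string connectionString = @"Data Source=.;Initial Catalog=Catalog;Integrated Security=True"; // assumed
        const string batch = @"
            SELECT TOP 20 p.ProductId, p.Name
            FROM dbo.Products p WHERE FREETEXT(p.Name, @kw) ORDER BY p.Name;

            SELECT COUNT(*) FROM dbo.Products p WHERE FREETEXT(p.Name, @kw);

            SELECT p.BrandId, COUNT(*) AS Cnt
            FROM dbo.Products p WHERE FREETEXT(p.Name, @kw) GROUP BY p.BrandId;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(batch, connection))
        {
            command.Parameters.AddWithValue("@kw", keyword);
            connection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read()) { /* first page of products */ }

                reader.NextResult();
                reader.Read();
                int totalCount = reader.GetInt32(0);   // total matches, for paging

                reader.NextResult();
                while (reader.Read()) { /* brand facet with per-brand counts */ }
            }
        }

    Sites at that scale also commonly precompute or cache the per-category counts rather than recomputing them on every request, which is likely part of how the filters appear instantly.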

  • Is there a way to create sub-databases, as a kind of subfolder, in SQL Server?

    - by user193655
    I am creating an application where there is a main DB and where other data is stored in secondary databases. The secondary databases follow a "plugin" approach. I use SQL Server. A simple installation of the application will just have the main DB, while as an option one can activate more "plugins", and for every plugin there will be a new database. Why did I make this choice? I have to work with an existing legacy system, and this is the smartest way I could figure out to implement the plugin system. The main DB and the plugin DBs have exactly the same schema (basically the plugin DBs hold some "special content", important data that one can use as a kind of template - think of a letter template, for example - in the application). Plugin DBs are therefore used in read-only mode; they are repositories of content. The "smart" thing is that the main application can also be used by plugin writers: they just write a DB, inserting content, and by making a backup of that database they have created a potential plugin (this is why all DBs have the same schema). These plugin DBs are downloaded from the internet whenever a content upgrade is available; each time, the full plugin DB is destroyed and a new one with the same name is created. This is for simplicity, and because the size of these DBs is generally small. Now, this works, but I would prefer to organize the DBs in a kind of tree structure, so that I can force the plugin DBs to be "sub-DBs" of the main application DB. As a workaround I am thinking of using naming rules, like: ApplicationDB (for the main application DB) and ApplicationDB_PlugIn_N (for the N-th plugin DB). When I search for plugin 1 I try to connect to ApplicationDB_PlugIn_1, and if I don't find the DB I raise an error. This situation can happen, for example, if some DBA renamed ApplicationDB_PlugIn_1. Since those plugin DBs really depend on ApplicationDB only, I was trying to "do the subfolder trick". Can anyone suggest a way to do this? Can you comment on this self-made plugin approach I described above?
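
    SQL Server has no real database hierarchy, so the naming convention usually ends up being enforced in code. A minimal sketch of that enforcement follows; it is not from the question, the connection string is assumed, and it uses sys.databases, which assumes SQL Server 2005 or later.

        // Hypothetical helper: resolves the N-th plugin database and fails clearly if a DBA renamed it.
        public static string GetPluginDatabaseName(string connectionString, int pluginNumber)
        {
            string dbName = "ApplicationDB_PlugIn_" + pluginNumber;
            const string sql = "SELECT COUNT(*) FROM sys.databases WHERE name = @name";

            using (var connection = new SqlConnection(connectionString))
            using (var command = new SqlCommand(sql, connection))
            {
                command.Parameters.AddWithValue("@name", dbName);
                connection.Open();
                if ((int)command.ExecuteScalar() == 0)
                    throw new InvalidOperationException("Plugin database not found: " + dbName);
                return dbName;
            }
        }

    Centralising the lookup like this at least gives one place to raise the "plugin missing or renamed" error, which is the weak point of the naming-rule workaround described above.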

  • I get a SQL syntax error when I debug my application

    - by newBie
    Hi, I want to update my database using FormatSqlParam, but when I debug it I get the error "Incorrect syntax near ','." This is my code:

        Dim sql2 As String = "update infoHotel set nameHotel = N" & FormatSqlParam(hotel) & _
            ", knownAs1 = N" & FormatSqlParam(KnownAs(0)) & _
            ", knownAs2 = N" & FormatSqlParam(KnownAs(1)) & _
            ", knownAs3 = N" & FormatSqlParam(KnownAs(2)) & _
            ", knownAs4 = N" & FormatSqlParam(KnownAs(3)) & _
            ", streetAddress = N" & FormatSqlParam(StreetAddress) & _
            ", locality = N" & FormatSqlParam(Locality) & _
            ", postalCode = N" & FormatSqlParam(PostalCode) & _
            ", country = N" & FormatSqlParam(Country) & _
            ", addressFull = N" & FormatSqlParam(address) & _
            ", tel = N" & FormatSqlParam(contact) & ","
        Dim objCommand3 As New SqlCommand(sql2, conn)
        objCommand3.ExecuteNonQuery()

    Maybe I'm missing some syntax, but I couldn't find where. Hope somebody can help; thanks in advance. I'm using VB.NET and SQL.
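
    The visible problem is the trailing comma appended after the tel column, which is what produces "Incorrect syntax near ','"; there is also no WHERE clause, so once the comma is removed the statement would update every row. A parameterised sketch follows, written in C# to match the other snippets on this page rather than the post's VB.NET; the table, column and variable names come from the post, while conn is assumed to be the already-open connection and hotelId and its WHERE column are invented.

        // Hypothetical parameterised version - no FormatSqlParam, no hand-built commas.
        const string sql = @"UPDATE infoHotel
                             SET nameHotel = @name, knownAs1 = @k1, knownAs2 = @k2,
                                 knownAs3 = @k3, knownAs4 = @k4, streetAddress = @street,
                                 locality = @locality, postalCode = @postalCode,
                                 country = @country, addressFull = @address, tel = @tel
                             WHERE hotelId = @hotelId";   // key column is an assumption

        using (var command = new SqlCommand(sql, conn))
        {
            command.Parameters.AddWithValue("@name", hotel);
            command.Parameters.AddWithValue("@k1", KnownAs[0]);
            // ... the remaining parameters follow the same pattern ...
            command.Parameters.AddWithValue("@hotelId", hotelId);
            command.ExecuteNonQuery();
        }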

  • Best Performing ORM for .NET

    - by steve_c
    I'm curious if anyone has done any performance comparisons with any or all of the main players in the .NET ORM space. Specifically, I'm interested in comparisons between the following: Linq to SQL, NHibernate, LLBL Gen, and Entity Framework. Though it seems people don't really consider Linq to SQL a true ORM, I am still including it in this list. Some performance metrics would be nice to see.

  • PeopleSoft queries - performance

    - by DBa
    Hi, I'm facing a problem with PeopleSoft queries (using an Oracle backend database): when a user sets off a rather complex query involving multiple records, PeopleSoft does an enforced join with the security records, producing SQL like this:

        select ....
        from ps_job a, PS_EMPL_SRCQRY a1, ps_table2 b, ps_sec_rcd2 b1, ps_table3 c, ps_sec_rcd3 c1
        where (...security joins a-a1, b-b1, c-c1...)
          and (...joins of a, b and c...)
          and a.setid_dept = 'XYZ';

    (Let's assume the last condition has high selectivity and there is an index on the column.) Obviously, due to the arrangement of the conditions, a huge join is created first and written to the temp segment, and only when the last condition is finally applied is a small subset selected. A query formulated this way is very likely to hit the preset timeout of the APPSRV, and even of the QRYSRV. If I were writing the query manually, I would move the most selective condition to the start, limiting the amount of data being handled considerably. Any ideas on how to make PeopleSoft behave like this? Actually, just rewriting the "Oracle-styled" SQL as ANSI SQL already seems to accelerate the queries - however, PeopleSoft writes Oracle-style queries... Thanks in advance, DBa

  • Do functions in SQL Server have different permission rules?

    - by jcollum
    Here's the situation. I'm writing an automated test that walks the list of dependencies for a proc and determines whether an account has rights to all of the dependent objects. My code looks like this:

        exec sp_depends 'the_proc_name'

        -- run this query on the results of sp_depends:
        select case when exists (
            select * from sys.database_permissions dp
            where grantee_principal_id = USER_ID('TheAccount')
              and major_id = object_id('dbo.theDependentObject')
              and minor_id = 0
              and state_desc = 'GRANT')
        then 'true' else 'false' end;

    It all seems to be working fine, but there's a hiccup when it encounters a function. I have one case where TheAccount doesn't have rights to a function (the query above returns false), and yet the proc that calls the function in question runs fine under TheAccount. So either there's something wrong with my test code, or functions have special permission behaviour in SQL Server that I'm not aware of. Should I change the code to search for 'DENY' instead of 'GRANT'? Do functions that are called in procs inherit the permissions of the calling proc except when execute rights are explicitly denied? Does my code suck?
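
    A likely explanation for the function case is ownership chaining: when a proc and the objects it references share an owner, permissions on the referenced objects (including the function) are not evaluated for the caller at all, so no direct GRANT is needed for the proc to run. One way to test what the account can actually do, rather than what rows exist in sys.database_permissions, is to check effective permissions under impersonation. The sketch below is an assumption, not the poster's code; the object and account names come from the question, the connection string is invented, and EXECUTE AS requires IMPERSONATE permission on that user.

        string connectionString = @"Data Source=.;Initial Catalog=TheDb;Integrated Security=True"; // assumed
        const string sql = @"
            EXECUTE AS USER = 'TheAccount';
            SELECT HAS_PERMS_BY_NAME('dbo.theDependentObject', 'OBJECT', 'EXECUTE') AS has_execute;
            REVERT;";

        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(sql, connection))
        {
            connection.Open();
            // 1/0 effective permission for the account, with any DENY already taken into account.
            int hasExecute = Convert.ToInt32(command.ExecuteScalar());
        }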

  • (SQL Server 2005, C#.NET) - I want just the insert query for a temp table

    - by John Stephen
    Hi, I am using C#.NET and SQL Server (Windows application). I have created a temporary table: when a button is clicked, the temporary table (#tmp_emp_details) is created. I have another button called "insert values" and also 5 textboxes. The values entered in the textboxes are used, and whenever the com.ExecuteNonQuery(); line is reached, it throws the error "Invalid object name '#tbl_emp_answer'.". Below is the code; please give me a solution. Code for the insert (in the insert values button):

        private void btninsertvalues_Click(object sender, EventArgs e)
        {
            username = txtusername.Text;
            examloginid = txtexamloginid.Text;
            question = txtquestion.Text;
            answer = txtanswer.Text;
            useranswer = txtanswer.Text;
            SqlConnection con = new SqlConnection("Data Source=.;Initial Catalog=tempdb;Integrated Security=True;");
            SqlCommand com = new SqlCommand("Insert into #tbl_emp_answer values('" + username + "','" + examloginid + "','" + question + "','" + answer + "','" + useranswer + "')", con);
            con.Open();
            com.ExecuteNonQuery();
            con.Close();
        }
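
    The likely cause: a local temp table (#...) only exists for the connection/session that created it, so creating it under one button's connection and then inserting from a brand-new SqlConnection in another handler gives "Invalid object name". A minimal sketch that keeps a single open connection for both steps follows; the column definitions and control names are assumptions, not taken from the original create-table code.

        private SqlConnection con;   // shared by both button handlers

        private void btncreate_Click(object sender, EventArgs e)
        {
            con = new SqlConnection("Data Source=.;Initial Catalog=tempdb;Integrated Security=True;");
            con.Open();   // keep it open, or the temp table disappears
            new SqlCommand("CREATE TABLE #tbl_emp_answer (username nvarchar(50), examloginid nvarchar(50), " +
                           "question nvarchar(max), answer nvarchar(max), useranswer nvarchar(max))", con)
                .ExecuteNonQuery();
        }

        private void btninsertvalues_Click(object sender, EventArgs e)
        {
            // Same connection, and parameters instead of string concatenation.
            var com = new SqlCommand("INSERT INTO #tbl_emp_answer VALUES (@u, @e, @q, @a, @ua)", con);
            com.Parameters.AddWithValue("@u", txtusername.Text);
            com.Parameters.AddWithValue("@e", txtexamloginid.Text);
            com.Parameters.AddWithValue("@q", txtquestion.Text);
            com.Parameters.AddWithValue("@a", txtanswer.Text);
            com.Parameters.AddWithValue("@ua", txtanswer.Text);
            com.ExecuteNonQuery();
        }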

  • Is this method a good approach to get SQL values from C#?

    - by MadBoy
    I have this little method that I use to get stuff from SQL. I call it with either varSearch = "" or varSearch = "something". I would like to know whether having the method written this way is best, or whether it would be better to split it into two methods (by overloading), or maybe I could somehow parametrise the whole WHERE clause?

        private void sqlPobierzKontrahentDaneKlienta(ListView varListView, string varSearch)
        {
            varListView.BeginUpdate();
            varListView.Items.Clear();
            string preparedCommand;
            if (varSearch == "")
            {
                preparedCommand = @"SELECT t1.[KlienciID],
                    CASE WHEN t2.[PodmiotRodzaj] = 'Firma'
                         THEN t2.[PodmiotFirmaNazwa]
                         ELSE t2.[PodmiotOsobaNazwisko] + ' ' + t2.[PodmiotOsobaImie]
                    END AS 'Nazwa'
                    FROM [BazaZarzadzanie].[dbo].[Klienci] t1
                    INNER JOIN [BazaZarzadzanie].[dbo].[Podmioty] t2 ON t1.[PodmiotID] = t2.[PodmiotID]
                    ORDER BY t1.[KlienciID]";
            }
            else
            {
                preparedCommand = @"SELECT t1.[KlienciID],
                    CASE WHEN t2.[PodmiotRodzaj] = 'Firma'
                         THEN t2.[PodmiotFirmaNazwa]
                         ELSE t2.[PodmiotOsobaNazwisko] + ' ' + t2.[PodmiotOsobaImie]
                    END AS 'Nazwa'
                    FROM [BazaZarzadzanie].[dbo].[Klienci] t1
                    INNER JOIN [BazaZarzadzanie].[dbo].[Podmioty] t2 ON t1.[PodmiotID] = t2.[PodmiotID]
                    WHERE t2.[PodmiotOsobaNazwisko] LIKE @searchValue
                       OR t2.[PodmiotFirmaNazwa] LIKE @searchValue
                       OR t2.[PodmiotOsobaImie] LIKE @searchValue
                    ORDER BY t1.[KlienciID]";
            }
            using (var varConnection = Locale.sqlConnectOneTime(Locale.sqlDataConnectionDetails))
            using (SqlCommand sqlQuery = new SqlCommand(preparedCommand, varConnection))
            {
                sqlQuery.Parameters.AddWithValue("@searchValue", "%" + varSearch + "%");
                using (SqlDataReader sqlQueryResult = sqlQuery.ExecuteReader())
                    if (sqlQueryResult != null)
                    {
                        while (sqlQueryResult.Read())
                        {
                            string varKontrahenciID = sqlQueryResult["KlienciID"].ToString();
                            string varKontrahent = sqlQueryResult["Nazwa"].ToString();
                            ListViewItem item = new ListViewItem(varKontrahenciID, 0);
                            item.SubItems.Add(varKontrahent);
                            varListView.Items.AddRange(new[] { item });
                        }
                    }
            }
            varListView.EndUpdate();
        }
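
    One way to drop the if/else and the duplicated SQL (a sketch under assumptions, not the author's code) is to keep a single statement and let the parameter decide whether the WHERE clause filters anything: when varSearch is empty, the parameter value is "%%", which the first predicate detects and lets every row through.

        // Hypothetical single-query variant; the rest of the method stays exactly as above,
        // including sqlQuery.Parameters.AddWithValue("@searchValue", "%" + varSearch + "%").
        const string preparedCommand = @"SELECT t1.[KlienciID],
            CASE WHEN t2.[PodmiotRodzaj] = 'Firma'
                 THEN t2.[PodmiotFirmaNazwa]
                 ELSE t2.[PodmiotOsobaNazwisko] + ' ' + t2.[PodmiotOsobaImie]
            END AS 'Nazwa'
            FROM [BazaZarzadzanie].[dbo].[Klienci] t1
            INNER JOIN [BazaZarzadzanie].[dbo].[Podmioty] t2 ON t1.[PodmiotID] = t2.[PodmiotID]
            WHERE @searchValue = '%%'                          -- empty search: no filtering
               OR t2.[PodmiotOsobaNazwisko] LIKE @searchValue
               OR t2.[PodmiotFirmaNazwa] LIKE @searchValue
               OR t2.[PodmiotOsobaImie] LIKE @searchValue
            ORDER BY t1.[KlienciID]";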

  • How to get values from a SQL query made by PHP?

    - by Ole Jak
    So I made a query like this:

        global $connection;
        $query = "SELECT * FROM streams ";
        $streams_set = mysql_query($query, $connection);
        confirm_query($streams_set);

    In my DB there are fields ID, UID, SID, TIME (all INT type except TIME). I am trying to print the query result into a form:

        <form>
            <select class="multiselect" multiple="multiple" name="SIDs">
            <?php
            global $connection;
            $query = "SELECT * FROM streams ";
            $streams_set = mysql_query($query, $connection);
            confirm_query($streams_set);
            $streams_count = mysql_num_rows($streams_set);
            for ($count=1; $count <= $streams_count; $count++) {
                echo "<option value=\"{$count}\"";
                echo ">{$count}</option>";
            }
            ?>
            </select>
            <br/>
            <input type="submit" value="Submit Form"/>
        </form>

    How do I print out the SIDs from my SQL query as the options' values?

  • Implementing search functionality with multiple optional parameters against a database table

    - by quarkX
    Hello, I would like to ask whether there is a preferred design pattern for implementing search functionality with multiple optional parameters against a database table, where access to the database must be via stored procedures only. The target platform is .NET with a SQL Server 2005/2008 backend, but I think this is a pretty generic problem. For example, we have a customer table and we want to provide search functionality in the UI for different parameters, like customer type, customer state, customer zip, etc., all of them optional and usable in any combination. In other words, the user can search by customer type only, or by customer type and customer zip, or any other possible combination. There are several available design approaches, but all of them have some disadvantages, and I would like to ask if there is a preferred design among them, or another approach entirely.

    1. Generate the SQL WHERE clause dynamically in the business tier, based on the search request from the UI, and pass it to a stored procedure as a parameter, something like @Where = 'where CustomerZip = 111111'. Inside the stored procedure, generate a dynamic SQL statement and execute it with sp_executesql. Disadvantage: dynamic SQL, SQL injection.

    2. Implement a stored procedure with multiple input parameters representing the search fields from the UI, and use the following construction to select only the records matching the requested fields:

        WHERE (CustomerType = @CustomerType OR @CustomerType IS NULL)
          AND (CustomerZip  = @CustomerZip  OR @CustomerZip  IS NULL)
          AND ...

       Disadvantage: possible performance issues with the SQL.

    3. Implement a separate stored procedure for each combination of search parameters. Disadvantage: the number of stored procedures increases rapidly with the number of search parameters, and code is repeated.
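
    A hedged sketch of approach 2 follows, with OPTION (RECOMPILE) added; on SQL Server 2008 that hint can let the optimizer build a plan for the parameter values actually supplied, which addresses the usual performance complaint about the catch-all WHERE clause. The table, columns and procedure name are illustrative, not from the question, and the C# part simply deploys the procedure.

        const string createProc = @"
            CREATE PROCEDURE dbo.SearchCustomers
                @CustomerType  int        = NULL,
                @CustomerState char(2)    = NULL,
                @CustomerZip   varchar(10) = NULL
            AS
            SELECT c.CustomerId, c.Name
            FROM dbo.Customer c
            WHERE (@CustomerType  IS NULL OR c.CustomerType  = @CustomerType)
              AND (@CustomerState IS NULL OR c.CustomerState = @CustomerState)
              AND (@CustomerZip   IS NULL OR c.CustomerZip   = @CustomerZip)
            OPTION (RECOMPILE);";

        string connectionString = @"Data Source=.;Initial Catalog=Crm;Integrated Security=True"; // assumed
        using (var connection = new SqlConnection(connectionString))
        using (var command = new SqlCommand(createProc, connection))
        {
            connection.Open();
            command.ExecuteNonQuery();   // one-time deployment of the search procedure
        }

    The trade-off is a compile on every call, which is usually acceptable for a user-driven search screen but worth measuring against approach 1's dynamic SQL.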

  • Performance issue with a website

    - by pradeep
    Hi, I have a website: [SPAM LINK REMOVED]. Pages like [SPAM LINK REMOVED] and [SPAM LINK REMOVED] load a bit slowly in Firefox; please see whether the speed is reasonable or needs to improve. In IE 6 it takes ages to load the website. Please let me know what I should do to increase the performance of the website. I tried all the options given by Firebug, but nothing helped much. If anyone can tell me where I should look to improve the performance, it would be very helpful.

  • Benchmarks for single- and multithreaded programs

    - by user280848
    Hi, I am trying to compare the performance of single-threaded and multithreaded Java programs. Are there any single-threaded benchmarks available which I could convert to a multithreaded version and then compare the performance? Could anybody guide me as to what kind of programs (not very small ones) are suitable for this empirical comparison? Thanks in advance.

  • SQL SERVER - Retrieve and Explore Database Backup without Restoring Database - Idera virtual database

    I recently downloaded Idera's SQL virtual database and tested it. There are a few things about this tool which caught my attention. My scenario: it is quite common in real life that observing or retrieving older data becomes necessary, even though that data has changed as time passed by. The full database backup was 40 GB in size, [...]

  • Microsoft 2010 Product Tour

    - by dmccollough
    Randy Walker, co-founder of the Northwest Arkansas .Net User Group and Microsoft MVP, has arranged for a couple of Microsoft experts, Sarika Calla (team lead on the IDE team) and Kevin Halverson, to give presentations on the newly released Visual Studio 2010.

    June 1 – Bentonville, Arkansas: Wal-Mart .Net User Group
    June 1 – Rogers, Arkansas: Northwest Arkansas SQL Server User Group (lunch meeting)
    June 1 – Springdale, Arkansas: Tyson devLoop
    June 1 – Fayetteville, Arkansas: Northwest Arkansas .Net User Group
    June 2 – Fort Smith, Arkansas: Datatronics
    June 2 – Little Rock, Arkansas: Little Rock .Net User Group
    June 3 – Fort Worth, Texas: Fort Worth .Net User Group

    Please contact Randy Walker with questions at [email protected].

  • Comparing LINQ to SQL vs the classic SqlCommand

    When you are coming from SqlCommand and SqlConnection, it is difficult to move to another library for your database needs. For those people still in limbo about moving to another DAL, here is a comparison to help you see the light or to move away for ever. How to do a select query using SqlCommand: 1: SqlConnection myConnection = new...
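
    The excerpt cuts off at the SqlCommand example, so a minimal sketch of the kind of comparison it sets up is shown below. This is not the article's code: the table, columns, connection string and the MyDataContext class (a Linq to SQL DataContext assumed to expose a Customers table) are all invented for illustration.

        // Classic ADO.NET select:
        using (var myConnection = new SqlConnection(connectionString))
        using (var command = new SqlCommand("SELECT Id, Name FROM dbo.Customers", myConnection))
        {
            myConnection.Open();
            using (SqlDataReader reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    Console.WriteLine("{0}: {1}", reader.GetInt32(0), reader.GetString(1));
                }
            }
        }

        // The equivalent with a Linq to SQL DataContext (hypothetical generated class):
        using (var db = new MyDataContext(connectionString))
        {
            foreach (var customer in db.Customers)
            {
                Console.WriteLine("{0}: {1}", customer.Id, customer.Name);
            }
        }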

  • UK SQL Server User Group Events (June)

    There are two events of note for the SQL Server User Group in June. The first is a Live Meeting event with myself on 04.06.2009, where I will look at how to integrate data mining into your BI solution, putting DM into SSIS, SSAS and SSRS. It will be very demo-oriented. You can register for the event here. The second is an event at Microsoft Reading on 10.06.2009. The evening will be a BI/data mining event. Chris Webb and I are organising it and we want speakers: we would love to see new faces up there telling us about their BI/DM solutions, tips and tricks. If you want to speak at the event, let me or Chris know. If you just want to attend, you can register here.

  • SQL SERVER - Enumerations in Relational Database - Best Practice

    This article has been submitted by Marko Parkkola, data systems designer at Saarionen Oy, Finland. Marko is an excellent developer and is always thinking at the next level. You can read his earlier comment, which created a very interesting discussion, here: SQL SERVER - IF EXISTS(Select null from table) vs IF EXISTS(Select 1 from table). I must express my special [...]

  • TSQL Quiz 2011 on beyondrelational.com

    - by Jalpesh P. Vadgama
    One of my friends, Jacob Sebastian, is running a SQL Server TSQL quiz on his site beyondrelational.com. This is a great opportunity to learn TSQL and win great prizes, like an Apple iPad and lots of other cool stuff. So whether you are an expert or just learning TSQL, it's a great way to test your knowledge. For the whole month of March the selected quiz masters will ask a question each day, and you have to answer these questions day by day; at the end of the month you will have a great chance to win an Apple iPad. For more details you can visit the following link: http://beyondrelational.com/quiz/SQLServer/TSQL/2011/default.aspx Hope you liked it. Stay tuned for more.

  • Find the occurrence of word/character in SQL column with wildcard character - PATINDEX

    - by Vipin
    CharIndex and PatIndex can both be used to determine the presence of a character or string within SQL column data. Both return the starting position of the first occurrence of the character/word within the expression. However, one major difference between CharIndex and PatIndex is that the latter allows the use of wildcard characters while searching for a character or word within column data. PatIndex is also useful for searching within the Text datatype. The allowed wildcard characters are % and _ :

    "%" - use it for any number of characters
    "_" - use it for a single character

    Syntax:

        PATINDEX('%pattern%', string_expression)

    Note - it is mandatory to include the pattern within % characters. It returns the starting position of the occurrence of the pattern, if found; returns 0 if not found; and returns NULL if either pattern or string_expression is NULL.

    Example:

        SELECT fldname FROM tblUsers WHERE PatIndex('%v_pin%', fldname) > 0
